Nov 25 18:07:34 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 25 18:07:34 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 25 18:07:34 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 18:07:34 localhost kernel: BIOS-provided physical RAM map:
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 25 18:07:34 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 25 18:07:34 localhost kernel: NX (Execute Disable) protection: active
Nov 25 18:07:34 localhost kernel: APIC: Static calls initialized
Nov 25 18:07:34 localhost kernel: SMBIOS 2.8 present.
Nov 25 18:07:34 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 25 18:07:34 localhost kernel: Hypervisor detected: KVM
Nov 25 18:07:34 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 25 18:07:34 localhost kernel: kvm-clock: using sched offset of 4127296481 cycles
Nov 25 18:07:34 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 25 18:07:34 localhost kernel: tsc: Detected 2800.000 MHz processor
Nov 25 18:07:34 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 25 18:07:34 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 25 18:07:34 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 25 18:07:34 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 25 18:07:34 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 25 18:07:34 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 25 18:07:34 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 25 18:07:34 localhost kernel: Using GB pages for direct mapping
Nov 25 18:07:34 localhost kernel: RAMDISK: [mem 0x2ed25000-0x3368afff]
Nov 25 18:07:34 localhost kernel: ACPI: Early table checksum verification disabled
Nov 25 18:07:34 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 25 18:07:34 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 18:07:34 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 18:07:34 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 18:07:34 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 25 18:07:34 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 18:07:34 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 25 18:07:34 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 25 18:07:34 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 25 18:07:34 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 25 18:07:34 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 25 18:07:34 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 25 18:07:34 localhost kernel: No NUMA configuration found
Nov 25 18:07:34 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 25 18:07:34 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 25 18:07:34 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 25 18:07:34 localhost kernel: Zone ranges:
Nov 25 18:07:34 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 25 18:07:34 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 25 18:07:34 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 25 18:07:34 localhost kernel:   Device   empty
Nov 25 18:07:34 localhost kernel: Movable zone start for each node
Nov 25 18:07:34 localhost kernel: Early memory node ranges
Nov 25 18:07:34 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 25 18:07:34 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 25 18:07:34 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 25 18:07:34 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 25 18:07:34 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 25 18:07:34 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 25 18:07:34 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 25 18:07:34 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 25 18:07:34 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 25 18:07:34 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 25 18:07:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 25 18:07:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 25 18:07:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 25 18:07:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 25 18:07:34 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 25 18:07:34 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 25 18:07:34 localhost kernel: TSC deadline timer available
Nov 25 18:07:34 localhost kernel: CPU topo: Max. logical packages:   8
Nov 25 18:07:34 localhost kernel: CPU topo: Max. logical dies:       8
Nov 25 18:07:34 localhost kernel: CPU topo: Max. dies per package:   1
Nov 25 18:07:34 localhost kernel: CPU topo: Max. threads per core:   1
Nov 25 18:07:34 localhost kernel: CPU topo: Num. cores per package:     1
Nov 25 18:07:34 localhost kernel: CPU topo: Num. threads per package:   1
Nov 25 18:07:34 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 25 18:07:34 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 25 18:07:34 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 25 18:07:34 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 25 18:07:34 localhost kernel: Booting paravirtualized kernel on KVM
Nov 25 18:07:34 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 25 18:07:34 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 25 18:07:34 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 25 18:07:34 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 25 18:07:34 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 25 18:07:34 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 25 18:07:34 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 18:07:34 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 25 18:07:34 localhost kernel: random: crng init done
Nov 25 18:07:34 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 25 18:07:34 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 25 18:07:34 localhost kernel: Fallback order for Node 0: 0 
Nov 25 18:07:34 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 25 18:07:34 localhost kernel: Policy zone: Normal
Nov 25 18:07:34 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 25 18:07:34 localhost kernel: software IO TLB: area num 8.
Nov 25 18:07:34 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 25 18:07:34 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Nov 25 18:07:34 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 25 18:07:34 localhost kernel: Dynamic Preempt: voluntary
Nov 25 18:07:34 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 25 18:07:34 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 25 18:07:34 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 25 18:07:34 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 25 18:07:34 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 25 18:07:34 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 25 18:07:34 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 25 18:07:34 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 25 18:07:34 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 25 18:07:34 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 25 18:07:34 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 25 18:07:34 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 25 18:07:34 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 25 18:07:34 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 25 18:07:34 localhost kernel: Console: colour VGA+ 80x25
Nov 25 18:07:34 localhost kernel: printk: console [ttyS0] enabled
Nov 25 18:07:34 localhost kernel: ACPI: Core revision 20230331
Nov 25 18:07:34 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 25 18:07:34 localhost kernel: x2apic enabled
Nov 25 18:07:34 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 25 18:07:34 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 25 18:07:34 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 25 18:07:34 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 25 18:07:34 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 25 18:07:34 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 25 18:07:34 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 25 18:07:34 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 25 18:07:34 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 25 18:07:34 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 25 18:07:34 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 25 18:07:34 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 25 18:07:34 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 25 18:07:34 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 25 18:07:34 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 25 18:07:34 localhost kernel: x86/bugs: return thunk changed
Nov 25 18:07:34 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 25 18:07:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 25 18:07:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 25 18:07:34 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 25 18:07:34 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 25 18:07:34 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 25 18:07:34 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 25 18:07:34 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 25 18:07:34 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 25 18:07:34 localhost kernel: landlock: Up and running.
Nov 25 18:07:34 localhost kernel: Yama: becoming mindful.
Nov 25 18:07:34 localhost kernel: SELinux:  Initializing.
Nov 25 18:07:34 localhost kernel: LSM support for eBPF active
Nov 25 18:07:34 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 25 18:07:34 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 25 18:07:34 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 25 18:07:34 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 25 18:07:34 localhost kernel: ... version:                0
Nov 25 18:07:34 localhost kernel: ... bit width:              48
Nov 25 18:07:34 localhost kernel: ... generic registers:      6
Nov 25 18:07:34 localhost kernel: ... value mask:             0000ffffffffffff
Nov 25 18:07:34 localhost kernel: ... max period:             00007fffffffffff
Nov 25 18:07:34 localhost kernel: ... fixed-purpose events:   0
Nov 25 18:07:34 localhost kernel: ... event mask:             000000000000003f
Nov 25 18:07:34 localhost kernel: signal: max sigframe size: 1776
Nov 25 18:07:34 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 25 18:07:34 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 25 18:07:34 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 25 18:07:34 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 25 18:07:34 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 25 18:07:34 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 25 18:07:34 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 25 18:07:34 localhost kernel: node 0 deferred pages initialised in 9ms
Nov 25 18:07:34 localhost kernel: Memory: 7776572K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 605568K reserved, 0K cma-reserved)
Nov 25 18:07:34 localhost kernel: devtmpfs: initialized
Nov 25 18:07:34 localhost kernel: x86/mm: Memory block size: 128MB
Nov 25 18:07:34 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 25 18:07:34 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 25 18:07:34 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 25 18:07:34 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 25 18:07:34 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 25 18:07:34 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 25 18:07:34 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 25 18:07:34 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 25 18:07:34 localhost kernel: audit: type=2000 audit(1764094052.431:1): state=initialized audit_enabled=0 res=1
Nov 25 18:07:34 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 25 18:07:34 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 25 18:07:34 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 25 18:07:34 localhost kernel: cpuidle: using governor menu
Nov 25 18:07:34 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 25 18:07:34 localhost kernel: PCI: Using configuration type 1 for base access
Nov 25 18:07:34 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 25 18:07:34 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 25 18:07:34 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 25 18:07:34 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 25 18:07:34 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 25 18:07:34 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 25 18:07:34 localhost kernel: Demotion targets for Node 0: null
Nov 25 18:07:34 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 25 18:07:34 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 25 18:07:34 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 25 18:07:34 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 25 18:07:34 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 25 18:07:34 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 25 18:07:34 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 25 18:07:34 localhost kernel: ACPI: Interpreter enabled
Nov 25 18:07:34 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 25 18:07:34 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 25 18:07:34 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 25 18:07:34 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 25 18:07:34 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 25 18:07:34 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 25 18:07:34 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [3] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [4] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [5] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [6] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [7] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [8] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [9] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [10] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [11] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [12] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [13] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [14] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [15] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [16] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [17] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [18] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [19] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [20] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [21] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [22] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [23] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [24] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [25] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [26] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [27] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [28] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [29] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [30] registered
Nov 25 18:07:34 localhost kernel: acpiphp: Slot [31] registered
Nov 25 18:07:34 localhost kernel: PCI host bridge to bus 0000:00
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 25 18:07:34 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 25 18:07:34 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 25 18:07:34 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 25 18:07:34 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 25 18:07:34 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 25 18:07:34 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 25 18:07:34 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 25 18:07:34 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 25 18:07:34 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 25 18:07:34 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 25 18:07:34 localhost kernel: iommu: Default domain type: Translated
Nov 25 18:07:34 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 25 18:07:34 localhost kernel: SCSI subsystem initialized
Nov 25 18:07:34 localhost kernel: ACPI: bus type USB registered
Nov 25 18:07:34 localhost kernel: usbcore: registered new interface driver usbfs
Nov 25 18:07:34 localhost kernel: usbcore: registered new interface driver hub
Nov 25 18:07:34 localhost kernel: usbcore: registered new device driver usb
Nov 25 18:07:34 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 25 18:07:34 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 25 18:07:34 localhost kernel: PTP clock support registered
Nov 25 18:07:34 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 25 18:07:34 localhost kernel: NetLabel: Initializing
Nov 25 18:07:34 localhost kernel: NetLabel:  domain hash size = 128
Nov 25 18:07:34 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 25 18:07:34 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 25 18:07:34 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 25 18:07:34 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 25 18:07:34 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 25 18:07:34 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 25 18:07:34 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 25 18:07:34 localhost kernel: vgaarb: loaded
Nov 25 18:07:34 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 25 18:07:34 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 25 18:07:34 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 25 18:07:34 localhost kernel: pnp: PnP ACPI init
Nov 25 18:07:34 localhost kernel: pnp 00:03: [dma 2]
Nov 25 18:07:34 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 25 18:07:34 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 25 18:07:34 localhost kernel: NET: Registered PF_INET protocol family
Nov 25 18:07:34 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 25 18:07:34 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 25 18:07:34 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 25 18:07:34 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 25 18:07:34 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 25 18:07:34 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 25 18:07:34 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 25 18:07:34 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 25 18:07:34 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 25 18:07:34 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 25 18:07:34 localhost kernel: NET: Registered PF_XDP protocol family
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 25 18:07:34 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 25 18:07:34 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 25 18:07:34 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 25 18:07:34 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 90851 usecs
Nov 25 18:07:34 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 25 18:07:34 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 25 18:07:34 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 25 18:07:34 localhost kernel: ACPI: bus type thunderbolt registered
Nov 25 18:07:34 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 25 18:07:34 localhost kernel: Initialise system trusted keyrings
Nov 25 18:07:34 localhost kernel: Key type blacklist registered
Nov 25 18:07:34 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 25 18:07:34 localhost kernel: zbud: loaded
Nov 25 18:07:34 localhost kernel: integrity: Platform Keyring initialized
Nov 25 18:07:34 localhost kernel: integrity: Machine keyring initialized
Nov 25 18:07:34 localhost kernel: Freeing initrd memory: 75160K
Nov 25 18:07:34 localhost kernel: NET: Registered PF_ALG protocol family
Nov 25 18:07:34 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 25 18:07:34 localhost kernel: Key type asymmetric registered
Nov 25 18:07:34 localhost kernel: Asymmetric key parser 'x509' registered
Nov 25 18:07:34 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 25 18:07:34 localhost kernel: io scheduler mq-deadline registered
Nov 25 18:07:34 localhost kernel: io scheduler kyber registered
Nov 25 18:07:34 localhost kernel: io scheduler bfq registered
Nov 25 18:07:34 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 25 18:07:34 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 25 18:07:34 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 25 18:07:34 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 25 18:07:34 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 25 18:07:34 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 25 18:07:34 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 25 18:07:34 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 25 18:07:34 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 25 18:07:34 localhost kernel: Non-volatile memory driver v1.3
Nov 25 18:07:34 localhost kernel: rdac: device handler registered
Nov 25 18:07:34 localhost kernel: hp_sw: device handler registered
Nov 25 18:07:34 localhost kernel: emc: device handler registered
Nov 25 18:07:34 localhost kernel: alua: device handler registered
Nov 25 18:07:34 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 25 18:07:34 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 25 18:07:34 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 25 18:07:34 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 25 18:07:34 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 25 18:07:34 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 25 18:07:34 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 25 18:07:34 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 25 18:07:34 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 25 18:07:34 localhost kernel: hub 1-0:1.0: USB hub found
Nov 25 18:07:34 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 25 18:07:34 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 25 18:07:34 localhost kernel: usbserial: USB Serial support registered for generic
Nov 25 18:07:34 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 25 18:07:34 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 25 18:07:34 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 25 18:07:34 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 25 18:07:34 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 25 18:07:34 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 25 18:07:34 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-25T18:07:33 UTC (1764094053)
Nov 25 18:07:34 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 25 18:07:34 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 25 18:07:34 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 25 18:07:34 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 25 18:07:34 localhost kernel: usbcore: registered new interface driver usbhid
Nov 25 18:07:34 localhost kernel: usbhid: USB HID core driver
Nov 25 18:07:34 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 25 18:07:34 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 25 18:07:34 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 25 18:07:34 localhost kernel: Initializing XFRM netlink socket
Nov 25 18:07:34 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 25 18:07:34 localhost kernel: Segment Routing with IPv6
Nov 25 18:07:34 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 25 18:07:34 localhost kernel: mpls_gso: MPLS GSO support
Nov 25 18:07:34 localhost kernel: IPI shorthand broadcast: enabled
Nov 25 18:07:34 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 25 18:07:34 localhost kernel: AES CTR mode by8 optimization enabled
Nov 25 18:07:34 localhost kernel: sched_clock: Marking stable (1255002190, 148658020)->(1484559870, -80899660)
Nov 25 18:07:34 localhost kernel: registered taskstats version 1
Nov 25 18:07:34 localhost kernel: Loading compiled-in X.509 certificates
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 25 18:07:34 localhost kernel: Demotion targets for Node 0: null
Nov 25 18:07:34 localhost kernel: page_owner is disabled
Nov 25 18:07:34 localhost kernel: Key type .fscrypt registered
Nov 25 18:07:34 localhost kernel: Key type fscrypt-provisioning registered
Nov 25 18:07:34 localhost kernel: Key type big_key registered
Nov 25 18:07:34 localhost kernel: Key type encrypted registered
Nov 25 18:07:34 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 25 18:07:34 localhost kernel: Loading compiled-in module X.509 certificates
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 25 18:07:34 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 25 18:07:34 localhost kernel: ima: No architecture policies found
Nov 25 18:07:34 localhost kernel: evm: Initialising EVM extended attributes:
Nov 25 18:07:34 localhost kernel: evm: security.selinux
Nov 25 18:07:34 localhost kernel: evm: security.SMACK64 (disabled)
Nov 25 18:07:34 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 25 18:07:34 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 25 18:07:34 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 25 18:07:34 localhost kernel: evm: security.apparmor (disabled)
Nov 25 18:07:34 localhost kernel: evm: security.ima
Nov 25 18:07:34 localhost kernel: evm: security.capability
Nov 25 18:07:34 localhost kernel: evm: HMAC attrs: 0x1
Nov 25 18:07:34 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 25 18:07:34 localhost kernel: Running certificate verification RSA selftest
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 25 18:07:34 localhost kernel: Running certificate verification ECDSA selftest
Nov 25 18:07:34 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 25 18:07:34 localhost kernel: clk: Disabling unused clocks
Nov 25 18:07:34 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 25 18:07:34 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 25 18:07:34 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 25 18:07:34 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 25 18:07:34 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 25 18:07:34 localhost kernel: Run /init as init process
Nov 25 18:07:34 localhost kernel:   with arguments:
Nov 25 18:07:34 localhost kernel:     /init
Nov 25 18:07:34 localhost kernel:   with environment:
Nov 25 18:07:34 localhost kernel:     HOME=/
Nov 25 18:07:34 localhost kernel:     TERM=linux
Nov 25 18:07:34 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Nov 25 18:07:34 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 25 18:07:34 localhost systemd[1]: Detected virtualization kvm.
Nov 25 18:07:34 localhost systemd[1]: Detected architecture x86-64.
Nov 25 18:07:34 localhost systemd[1]: Running in initrd.
Nov 25 18:07:34 localhost systemd[1]: No hostname configured, using default hostname.
Nov 25 18:07:34 localhost systemd[1]: Hostname set to <localhost>.
Nov 25 18:07:34 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 25 18:07:34 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 25 18:07:34 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 25 18:07:34 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 25 18:07:34 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 25 18:07:34 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 25 18:07:34 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 25 18:07:34 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 25 18:07:34 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 25 18:07:34 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 25 18:07:34 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 25 18:07:34 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 25 18:07:34 localhost systemd[1]: Reached target Local File Systems.
Nov 25 18:07:34 localhost systemd[1]: Reached target Path Units.
Nov 25 18:07:34 localhost systemd[1]: Reached target Slice Units.
Nov 25 18:07:34 localhost systemd[1]: Reached target Swaps.
Nov 25 18:07:34 localhost systemd[1]: Reached target Timer Units.
Nov 25 18:07:34 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 25 18:07:34 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 25 18:07:34 localhost systemd[1]: Listening on Journal Socket.
Nov 25 18:07:34 localhost systemd[1]: Listening on udev Control Socket.
Nov 25 18:07:34 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 25 18:07:34 localhost systemd[1]: Reached target Socket Units.
Nov 25 18:07:34 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 25 18:07:34 localhost systemd[1]: Starting Journal Service...
Nov 25 18:07:34 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 25 18:07:34 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 25 18:07:34 localhost systemd[1]: Starting Create System Users...
Nov 25 18:07:34 localhost systemd[1]: Starting Setup Virtual Console...
Nov 25 18:07:34 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 25 18:07:34 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 25 18:07:34 localhost systemd[1]: Finished Create System Users.
Nov 25 18:07:34 localhost systemd-journald[307]: Journal started
Nov 25 18:07:34 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/67ab9541c16e406bbe77292a72d03114) is 8.0M, max 153.6M, 145.6M free.
Nov 25 18:07:34 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 25 18:07:34 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 25 18:07:34 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 25 18:07:34 localhost systemd[1]: Started Journal Service.
Nov 25 18:07:34 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 25 18:07:34 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 25 18:07:34 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 25 18:07:34 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 25 18:07:34 localhost systemd[1]: Finished Setup Virtual Console.
Nov 25 18:07:34 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 25 18:07:34 localhost systemd[1]: Starting dracut cmdline hook...
Nov 25 18:07:34 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Nov 25 18:07:34 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 25 18:07:34 localhost systemd[1]: Finished dracut cmdline hook.
Nov 25 18:07:34 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 25 18:07:34 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 25 18:07:34 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 25 18:07:34 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 25 18:07:34 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 25 18:07:34 localhost kernel: RPC: Registered udp transport module.
Nov 25 18:07:34 localhost kernel: RPC: Registered tcp transport module.
Nov 25 18:07:34 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 25 18:07:34 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 25 18:07:34 localhost rpc.statd[444]: Version 2.5.4 starting
Nov 25 18:07:34 localhost rpc.statd[444]: Initializing NSM state
Nov 25 18:07:34 localhost rpc.idmapd[449]: Setting log level to 0
Nov 25 18:07:34 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 25 18:07:35 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 25 18:07:35 localhost systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Nov 25 18:07:35 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 25 18:07:35 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 25 18:07:35 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 25 18:07:35 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 25 18:07:35 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 25 18:07:35 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 25 18:07:35 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 25 18:07:35 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 18:07:35 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 25 18:07:35 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 25 18:07:35 localhost systemd[1]: Reached target Network.
Nov 25 18:07:35 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 25 18:07:35 localhost systemd[1]: Starting dracut initqueue hook...
Nov 25 18:07:35 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 25 18:07:35 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 25 18:07:35 localhost systemd[1]: Reached target System Initialization.
Nov 25 18:07:35 localhost systemd[1]: Reached target Basic System.
Nov 25 18:07:35 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 25 18:07:35 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 25 18:07:35 localhost kernel: libata version 3.00 loaded.
Nov 25 18:07:35 localhost kernel:  vda: vda1
Nov 25 18:07:35 localhost systemd-udevd[477]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:07:35 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 25 18:07:35 localhost kernel: scsi host0: ata_piix
Nov 25 18:07:35 localhost kernel: scsi host1: ata_piix
Nov 25 18:07:35 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 25 18:07:35 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 25 18:07:35 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 25 18:07:35 localhost systemd[1]: Reached target Initrd Root Device.
Nov 25 18:07:35 localhost kernel: ata1: found unknown device (class 0)
Nov 25 18:07:35 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 25 18:07:35 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 25 18:07:35 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 25 18:07:35 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 25 18:07:35 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 25 18:07:35 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 25 18:07:35 localhost systemd[1]: Finished dracut initqueue hook.
Nov 25 18:07:35 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 25 18:07:35 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 25 18:07:35 localhost systemd[1]: Reached target Remote File Systems.
Nov 25 18:07:35 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 25 18:07:35 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 25 18:07:35 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 25 18:07:35 localhost systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Nov 25 18:07:35 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 25 18:07:35 localhost systemd[1]: Mounting /sysroot...
Nov 25 18:07:36 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 25 18:07:36 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 25 18:07:36 localhost kernel: XFS (vda1): Ending clean mount
Nov 25 18:07:36 localhost systemd[1]: Mounted /sysroot.
Nov 25 18:07:36 localhost systemd[1]: Reached target Initrd Root File System.
Nov 25 18:07:36 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 25 18:07:36 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 25 18:07:36 localhost systemd[1]: Reached target Initrd File Systems.
Nov 25 18:07:36 localhost systemd[1]: Reached target Initrd Default Target.
Nov 25 18:07:36 localhost systemd[1]: Starting dracut mount hook...
Nov 25 18:07:36 localhost systemd[1]: Finished dracut mount hook.
Nov 25 18:07:36 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 25 18:07:36 localhost rpc.idmapd[449]: exiting on signal 15
Nov 25 18:07:36 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 25 18:07:36 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 25 18:07:36 localhost systemd[1]: Stopped target Network.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Timer Units.
Nov 25 18:07:36 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 25 18:07:36 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Basic System.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Path Units.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Remote File Systems.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Slice Units.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Socket Units.
Nov 25 18:07:36 localhost systemd[1]: Stopped target System Initialization.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Local File Systems.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Swaps.
Nov 25 18:07:36 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut mount hook.
Nov 25 18:07:36 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 25 18:07:36 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 25 18:07:36 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 25 18:07:36 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 25 18:07:36 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 25 18:07:36 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 25 18:07:36 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 25 18:07:36 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 25 18:07:36 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 25 18:07:36 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 25 18:07:36 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 25 18:07:36 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 25 18:07:36 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Closed udev Control Socket.
Nov 25 18:07:36 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Closed udev Kernel Socket.
Nov 25 18:07:36 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 25 18:07:36 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 25 18:07:36 localhost systemd[1]: Starting Cleanup udev Database...
Nov 25 18:07:36 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 25 18:07:36 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 25 18:07:36 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Stopped Create System Users.
Nov 25 18:07:36 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 25 18:07:36 localhost systemd[1]: Finished Cleanup udev Database.
Nov 25 18:07:36 localhost systemd[1]: Reached target Switch Root.
Nov 25 18:07:36 localhost systemd[1]: Starting Switch Root...
Nov 25 18:07:36 localhost systemd[1]: Switching root.
Nov 25 18:07:36 localhost systemd-journald[307]: Journal stopped
Nov 25 18:07:37 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Nov 25 18:07:37 localhost kernel: audit: type=1404 audit(1764094056.961:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability open_perms=1
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:07:37 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:07:37 localhost kernel: audit: type=1403 audit(1764094057.140:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 25 18:07:37 localhost systemd[1]: Successfully loaded SELinux policy in 183.797ms.
Nov 25 18:07:37 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.667ms.
Nov 25 18:07:37 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 25 18:07:37 localhost systemd[1]: Detected virtualization kvm.
Nov 25 18:07:37 localhost systemd[1]: Detected architecture x86-64.
Nov 25 18:07:37 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:07:37 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 25 18:07:37 localhost systemd[1]: Stopped Switch Root.
Nov 25 18:07:37 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 25 18:07:37 localhost systemd[1]: Created slice Slice /system/getty.
Nov 25 18:07:37 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 25 18:07:37 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 25 18:07:37 localhost systemd[1]: Created slice User and Session Slice.
Nov 25 18:07:37 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 25 18:07:37 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 25 18:07:37 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 25 18:07:37 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 25 18:07:37 localhost systemd[1]: Stopped target Switch Root.
Nov 25 18:07:37 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 25 18:07:37 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 25 18:07:37 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 25 18:07:37 localhost systemd[1]: Reached target Path Units.
Nov 25 18:07:37 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 25 18:07:37 localhost systemd[1]: Reached target Slice Units.
Nov 25 18:07:37 localhost systemd[1]: Reached target Swaps.
Nov 25 18:07:37 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 25 18:07:37 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 25 18:07:37 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 25 18:07:37 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 25 18:07:37 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 25 18:07:37 localhost systemd[1]: Listening on udev Control Socket.
Nov 25 18:07:37 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 25 18:07:37 localhost systemd[1]: Mounting Huge Pages File System...
Nov 25 18:07:37 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 25 18:07:37 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 25 18:07:37 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 25 18:07:37 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 18:07:37 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 25 18:07:37 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 25 18:07:37 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 25 18:07:37 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 25 18:07:37 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 25 18:07:37 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 25 18:07:37 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 25 18:07:37 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 25 18:07:37 localhost systemd[1]: Stopped Journal Service.
Nov 25 18:07:37 localhost kernel: fuse: init (API version 7.37)
Nov 25 18:07:37 localhost systemd[1]: Starting Journal Service...
Nov 25 18:07:37 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 25 18:07:37 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 25 18:07:37 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 18:07:37 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 25 18:07:37 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 25 18:07:37 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 25 18:07:37 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 25 18:07:37 localhost systemd[1]: Mounted Huge Pages File System.
Nov 25 18:07:37 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 25 18:07:37 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 25 18:07:37 localhost systemd-journald[680]: Journal started
Nov 25 18:07:37 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 25 18:07:37 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 25 18:07:37 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 25 18:07:37 localhost systemd[1]: Started Journal Service.
Nov 25 18:07:37 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 25 18:07:37 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 25 18:07:37 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 18:07:37 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 25 18:07:37 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 25 18:07:37 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 25 18:07:37 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 25 18:07:37 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 25 18:07:37 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 25 18:07:37 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 25 18:07:37 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 25 18:07:37 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 25 18:07:37 localhost kernel: ACPI: bus type drm_connector registered
Nov 25 18:07:37 localhost systemd[1]: Mounting FUSE Control File System...
Nov 25 18:07:37 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 25 18:07:37 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 25 18:07:37 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 25 18:07:37 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 25 18:07:37 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 25 18:07:37 localhost systemd[1]: Starting Create System Users...
Nov 25 18:07:37 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 25 18:07:37 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 25 18:07:37 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 25 18:07:37 localhost systemd-journald[680]: Received client request to flush runtime journal.
Nov 25 18:07:37 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 25 18:07:37 localhost systemd[1]: Mounted FUSE Control File System.
Nov 25 18:07:37 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 25 18:07:37 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 25 18:07:37 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 25 18:07:37 localhost systemd[1]: Finished Create System Users.
Nov 25 18:07:38 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 25 18:07:38 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 25 18:07:38 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 25 18:07:38 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 25 18:07:38 localhost systemd[1]: Reached target Local File Systems.
Nov 25 18:07:38 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 25 18:07:38 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 25 18:07:38 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 25 18:07:38 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 25 18:07:38 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 25 18:07:38 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 25 18:07:38 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 25 18:07:38 localhost bootctl[698]: Couldn't find EFI system partition, skipping.
Nov 25 18:07:38 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 25 18:07:38 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 25 18:07:38 localhost systemd[1]: Starting Security Auditing Service...
Nov 25 18:07:38 localhost systemd[1]: Starting RPC Bind...
Nov 25 18:07:38 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 25 18:07:38 localhost auditd[704]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 25 18:07:38 localhost auditd[704]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 25 18:07:38 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 25 18:07:38 localhost systemd[1]: Started RPC Bind.
Nov 25 18:07:38 localhost augenrules[709]: /sbin/augenrules: No change
Nov 25 18:07:38 localhost augenrules[724]: No rules
Nov 25 18:07:38 localhost augenrules[724]: enabled 1
Nov 25 18:07:38 localhost augenrules[724]: failure 1
Nov 25 18:07:38 localhost augenrules[724]: pid 704
Nov 25 18:07:38 localhost augenrules[724]: rate_limit 0
Nov 25 18:07:38 localhost augenrules[724]: backlog_limit 8192
Nov 25 18:07:38 localhost augenrules[724]: lost 0
Nov 25 18:07:38 localhost augenrules[724]: backlog 0
Nov 25 18:07:38 localhost augenrules[724]: backlog_wait_time 60000
Nov 25 18:07:38 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 25 18:07:38 localhost augenrules[724]: enabled 1
Nov 25 18:07:38 localhost augenrules[724]: failure 1
Nov 25 18:07:38 localhost augenrules[724]: pid 704
Nov 25 18:07:38 localhost augenrules[724]: rate_limit 0
Nov 25 18:07:38 localhost augenrules[724]: backlog_limit 8192
Nov 25 18:07:38 localhost augenrules[724]: lost 0
Nov 25 18:07:38 localhost augenrules[724]: backlog 0
Nov 25 18:07:38 localhost augenrules[724]: backlog_wait_time 60000
Nov 25 18:07:38 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 25 18:07:38 localhost augenrules[724]: enabled 1
Nov 25 18:07:38 localhost augenrules[724]: failure 1
Nov 25 18:07:38 localhost augenrules[724]: pid 704
Nov 25 18:07:38 localhost augenrules[724]: rate_limit 0
Nov 25 18:07:38 localhost augenrules[724]: backlog_limit 8192
Nov 25 18:07:38 localhost augenrules[724]: lost 0
Nov 25 18:07:38 localhost augenrules[724]: backlog 0
Nov 25 18:07:38 localhost augenrules[724]: backlog_wait_time 60000
Nov 25 18:07:38 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 25 18:07:38 localhost systemd[1]: Started Security Auditing Service.
Nov 25 18:07:38 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 25 18:07:38 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 25 18:07:38 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 25 18:07:38 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 25 18:07:38 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Nov 25 18:07:38 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 25 18:07:38 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 25 18:07:38 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 25 18:07:38 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 25 18:07:38 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 25 18:07:38 localhost systemd-udevd[735]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:07:38 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 25 18:07:38 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 25 18:07:38 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 25 18:07:38 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 25 18:07:38 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 25 18:07:38 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 25 18:07:38 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 25 18:07:38 localhost kernel: Console: switching to colour dummy device 80x25
Nov 25 18:07:38 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 25 18:07:38 localhost kernel: [drm] features: -context_init
Nov 25 18:07:38 localhost kernel: [drm] number of scanouts: 1
Nov 25 18:07:38 localhost kernel: [drm] number of cap sets: 0
Nov 25 18:07:38 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 25 18:07:38 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 25 18:07:38 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 25 18:07:38 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 25 18:07:38 localhost kernel: kvm_amd: TSC scaling supported
Nov 25 18:07:38 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 25 18:07:38 localhost kernel: kvm_amd: Nested Paging enabled
Nov 25 18:07:38 localhost kernel: kvm_amd: LBR virtualization supported
Nov 25 18:07:39 localhost systemd[1]: Starting Update is Completed...
Nov 25 18:07:39 localhost systemd[1]: Finished Update is Completed.
Nov 25 18:07:39 localhost systemd[1]: Reached target System Initialization.
Nov 25 18:07:39 localhost systemd[1]: Started dnf makecache --timer.
Nov 25 18:07:39 localhost systemd[1]: Started Daily rotation of log files.
Nov 25 18:07:39 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 25 18:07:39 localhost systemd[1]: Reached target Timer Units.
Nov 25 18:07:39 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 25 18:07:39 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 25 18:07:39 localhost systemd[1]: Reached target Socket Units.
Nov 25 18:07:39 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 25 18:07:39 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 18:07:39 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 25 18:07:39 localhost systemd[1]: Reached target Basic System.
Nov 25 18:07:39 localhost dbus-broker-lau[791]: Ready
Nov 25 18:07:39 localhost systemd[1]: Starting NTP client/server...
Nov 25 18:07:39 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 25 18:07:39 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 25 18:07:39 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 25 18:07:39 localhost systemd[1]: Started irqbalance daemon.
Nov 25 18:07:39 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 25 18:07:39 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 18:07:39 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 18:07:39 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 18:07:39 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 25 18:07:39 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 25 18:07:39 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 25 18:07:39 localhost systemd[1]: Starting User Login Management...
Nov 25 18:07:39 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 25 18:07:39 localhost chronyd[833]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 25 18:07:39 localhost chronyd[833]: Loaded 0 symmetric keys
Nov 25 18:07:39 localhost chronyd[833]: Using right/UTC timezone to obtain leap second data
Nov 25 18:07:39 localhost chronyd[833]: Loaded seccomp filter (level 2)
Nov 25 18:07:39 localhost systemd[1]: Started NTP client/server.
Nov 25 18:07:39 localhost systemd-logind[820]: New seat seat0.
Nov 25 18:07:39 localhost systemd-logind[820]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 25 18:07:39 localhost systemd-logind[820]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 25 18:07:39 localhost systemd[1]: Started User Login Management.
Nov 25 18:07:39 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 25 18:07:39 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 25 18:07:39 localhost iptables.init[818]: iptables: Applying firewall rules: [  OK  ]
Nov 25 18:07:39 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 25 18:07:40 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 25 Nov 2025 18:07:40 +0000. Up 8.01 seconds.
Nov 25 18:07:40 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 25 18:07:40 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 25 18:07:40 localhost systemd[1]: run-cloud\x2dinit-tmp-tmppu5jy0jp.mount: Deactivated successfully.
Nov 25 18:07:40 localhost systemd[1]: Starting Hostname Service...
Nov 25 18:07:40 localhost systemd[1]: Started Hostname Service.
Nov 25 18:07:40 np0005535692.novalocal systemd-hostnamed[856]: Hostname set to <np0005535692.novalocal> (static)
Nov 25 18:07:40 np0005535692.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 25 18:07:40 np0005535692.novalocal systemd[1]: Reached target Preparation for Network.
Nov 25 18:07:40 np0005535692.novalocal systemd[1]: Starting Network Manager...
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9466] NetworkManager (version 1.54.1-1.el9) is starting... (boot:5314dbd0-f0d3-4d8c-818c-96beee19bec6)
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9471] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9615] manager[0x55a211d05080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9663] hostname: hostname: using hostnamed
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9663] hostname: static hostname changed from (none) to "np0005535692.novalocal"
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9668] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9835] manager[0x55a211d05080]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9835] manager[0x55a211d05080]: rfkill: WWAN hardware radio set enabled
Nov 25 18:07:40 np0005535692.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9940] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9942] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9944] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9946] manager: Networking is enabled by state file
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9950] settings: Loaded settings plugin: keyfile (internal)
Nov 25 18:07:40 np0005535692.novalocal NetworkManager[860]: <info>  [1764094060.9985] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0010] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0037] dhcp: init: Using DHCP client 'internal'
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0039] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0051] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0065] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0072] device (lo): Activation: starting connection 'lo' (7ebbea74-d1bb-4fb2-acd3-42edf212bfe7)
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0080] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0084] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0109] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0113] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0116] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0118] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0120] device (eth0): carrier: link connected
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0123] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0129] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0136] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0140] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0141] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0144] manager: NetworkManager state is now CONNECTING
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0146] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0153] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0156] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0197] dhcp4 (eth0): state changed new lease, address=38.102.83.177
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Started Network Manager.
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0206] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0226] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Reached target Network.
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0427] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0430] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0433] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0442] device (lo): Activation: successful, device activated.
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0451] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0456] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0462] device (eth0): Activation: successful, device activated.
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0469] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 18:07:41 np0005535692.novalocal NetworkManager[860]: <info>  [1764094061.0473] manager: startup complete
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Reached target NFS client services.
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Reached target Remote File Systems.
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 25 18:07:41 np0005535692.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 25 Nov 2025 18:07:41 +0000. Up 9.09 seconds.
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |  eth0  | True |        38.102.83.177         | 255.255.255.0 | global | fa:16:3e:86:18:2d |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |  eth0  | True | fe80::f816:3eff:fe86:182d/64 |       .       |  link  | fa:16:3e:86:18:2d |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 25 18:07:41 np0005535692.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 25 18:07:42 np0005535692.novalocal useradd[991]: new group: name=cloud-user, GID=1001
Nov 25 18:07:42 np0005535692.novalocal useradd[991]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 25 18:07:42 np0005535692.novalocal useradd[991]: add 'cloud-user' to group 'adm'
Nov 25 18:07:42 np0005535692.novalocal useradd[991]: add 'cloud-user' to group 'systemd-journal'
Nov 25 18:07:42 np0005535692.novalocal useradd[991]: add 'cloud-user' to shadow group 'adm'
Nov 25 18:07:42 np0005535692.novalocal useradd[991]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Generating public/private rsa key pair.
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: The key fingerprint is:
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: SHA256:pY/Xb4tNBwD2+S5I2UdZJxrzuUVlqrt3hHDinKm2hH0 root@np0005535692.novalocal
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: The key's randomart image is:
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: +---[RSA 3072]----+
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |          o o . B|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |         . o * O.|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |          . = * .|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |         o oo=.o |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |        S ooo*=. |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |         * o=+...|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |        o *.E o..|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |         oo. Bo..|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |         ...oo+o |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: +----[SHA256]-----+
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Generating public/private ecdsa key pair.
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: The key fingerprint is:
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: SHA256:/4IqI4CT6a0vycv/Y1+20xzlU2mEI0whcPQ7+5Mdspo root@np0005535692.novalocal
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: The key's randomart image is:
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: +---[ECDSA 256]---+
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |       .o+oo. .  |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |        . oo o . |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |           .. o .|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |            .. + |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |.o      S  oo o  |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |*        . .o+ . |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |ooo      ++.. * .|
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |.=..oo  +.o+o+ . |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: | +*+o++o ..Eo..  |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: +----[SHA256]-----+
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Generating public/private ed25519 key pair.
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: The key fingerprint is:
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: SHA256:LbWhsQsI2+IPkQygKqO+2ZJ0GGPLF9Soqy8SpC03BIg root@np0005535692.novalocal
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: The key's randomart image is:
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: +--[ED25519 256]--+
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |+   o            |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |E  o .           |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |o.+     . o      |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |.O.* .   * o     |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |Oo% + . S o      |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |*O+=   . o       |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |++*.    .        |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |=ooo             |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: |.*+..            |
Nov 25 18:07:42 np0005535692.novalocal cloud-init[924]: +----[SHA256]-----+
Nov 25 18:07:43 np0005535692.novalocal sm-notify[1007]: Version 2.5.4 starting
Nov 25 18:07:42 np0005535692.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 25 18:07:43 np0005535692.novalocal sshd[1009]: Server listening on 0.0.0.0 port 22.
Nov 25 18:07:42 np0005535692.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 25 18:07:43 np0005535692.novalocal sshd[1009]: Server listening on :: port 22.
Nov 25 18:07:42 np0005535692.novalocal systemd[1]: Reached target Network is Online.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting System Logging Service...
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting Permit User Sessions...
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Finished Permit User Sessions.
Nov 25 18:07:43 np0005535692.novalocal rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Nov 25 18:07:43 np0005535692.novalocal rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Started Command Scheduler.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Started Getty on tty1.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Reached target Login Prompts.
Nov 25 18:07:43 np0005535692.novalocal crond[1012]: (CRON) STARTUP (1.5.7)
Nov 25 18:07:43 np0005535692.novalocal crond[1012]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 25 18:07:43 np0005535692.novalocal crond[1012]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 37% if used.)
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Started System Logging Service.
Nov 25 18:07:43 np0005535692.novalocal crond[1012]: (CRON) INFO (running with inotify support)
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Reached target Multi-User System.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 25 18:07:43 np0005535692.novalocal rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 25 18:07:43 np0005535692.novalocal kdumpctl[1021]: kdump: No kdump initial ramdisk found.
Nov 25 18:07:43 np0005535692.novalocal kdumpctl[1021]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1105]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 25 Nov 2025 18:07:43 +0000. Up 11.04 seconds.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1240]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 25 Nov 2025 18:07:43 +0000. Up 11.39 seconds.
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1243]: Connection reset by 38.102.83.114 port 56174 [preauth]
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1255]: Unable to negotiate with 38.102.83.114 port 56184: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1261]: Connection reset by 38.102.83.114 port 56198 [preauth]
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1269]: #############################################################
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1273]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1278]: 256 SHA256:/4IqI4CT6a0vycv/Y1+20xzlU2mEI0whcPQ7+5Mdspo root@np0005535692.novalocal (ECDSA)
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1274]: Unable to negotiate with 38.102.83.114 port 56202: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1283]: 256 SHA256:LbWhsQsI2+IPkQygKqO+2ZJ0GGPLF9Soqy8SpC03BIg root@np0005535692.novalocal (ED25519)
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1285]: Unable to negotiate with 38.102.83.114 port 56208: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1291]: 3072 SHA256:pY/Xb4tNBwD2+S5I2UdZJxrzuUVlqrt3hHDinKm2hH0 root@np0005535692.novalocal (RSA)
Nov 25 18:07:43 np0005535692.novalocal dracut[1289]: dracut-057-102.git20250818.el9
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1292]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1295]: #############################################################
Nov 25 18:07:43 np0005535692.novalocal cloud-init[1240]: Cloud-init v. 24.4-7.el9 finished at Tue, 25 Nov 2025 18:07:43 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.60 seconds
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 25 18:07:43 np0005535692.novalocal systemd[1]: Reached target Cloud-init target.
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1314]: Connection reset by 38.102.83.114 port 56234 [preauth]
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1316]: Unable to negotiate with 38.102.83.114 port 56248: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 25 18:07:43 np0005535692.novalocal sshd-session[1319]: Unable to negotiate with 38.102.83.114 port 56264: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 25 18:07:44 np0005535692.novalocal sshd-session[1293]: Connection closed by 38.102.83.114 port 56222 [preauth]
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 25 18:07:44 np0005535692.novalocal dracut[1296]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: memstrack is not available
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: memstrack is not available
Nov 25 18:07:45 np0005535692.novalocal dracut[1296]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 25 18:07:45 np0005535692.novalocal chronyd[833]: Selected source 23.159.16.194 (2.centos.pool.ntp.org)
Nov 25 18:07:45 np0005535692.novalocal chronyd[833]: System clock TAI offset set to 37 seconds
Nov 25 18:07:46 np0005535692.novalocal dracut[1296]: *** Including module: systemd ***
Nov 25 18:07:46 np0005535692.novalocal dracut[1296]: *** Including module: fips ***
Nov 25 18:07:47 np0005535692.novalocal dracut[1296]: *** Including module: systemd-initrd ***
Nov 25 18:07:47 np0005535692.novalocal dracut[1296]: *** Including module: i18n ***
Nov 25 18:07:47 np0005535692.novalocal dracut[1296]: *** Including module: drm ***
Nov 25 18:07:47 np0005535692.novalocal dracut[1296]: *** Including module: prefixdevname ***
Nov 25 18:07:47 np0005535692.novalocal dracut[1296]: *** Including module: kernel-modules ***
Nov 25 18:07:47 np0005535692.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]: *** Including module: kernel-modules-extra ***
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]: *** Including module: qemu ***
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]: *** Including module: fstab-sys ***
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]: *** Including module: rootfs-block ***
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]: *** Including module: terminfo ***
Nov 25 18:07:48 np0005535692.novalocal dracut[1296]: *** Including module: udev-rules ***
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: IRQ 25 affinity is now unmanaged
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: IRQ 31 affinity is now unmanaged
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: IRQ 28 affinity is now unmanaged
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: IRQ 32 affinity is now unmanaged
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: IRQ 30 affinity is now unmanaged
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 25 18:07:49 np0005535692.novalocal irqbalance[819]: IRQ 29 affinity is now unmanaged
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: Skipping udev rule: 91-permissions.rules
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: *** Including module: virtiofs ***
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: *** Including module: dracut-systemd ***
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: *** Including module: usrmount ***
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: *** Including module: base ***
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: *** Including module: fs-lib ***
Nov 25 18:07:49 np0005535692.novalocal dracut[1296]: *** Including module: kdumpbase ***
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:   microcode_ctl module: mangling fw_dir
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]: *** Including module: openssl ***
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]: *** Including module: shutdown ***
Nov 25 18:07:50 np0005535692.novalocal dracut[1296]: *** Including module: squash ***
Nov 25 18:07:51 np0005535692.novalocal dracut[1296]: *** Including modules done ***
Nov 25 18:07:51 np0005535692.novalocal dracut[1296]: *** Installing kernel module dependencies ***
Nov 25 18:07:51 np0005535692.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 18:07:51 np0005535692.novalocal dracut[1296]: *** Installing kernel module dependencies done ***
Nov 25 18:07:51 np0005535692.novalocal dracut[1296]: *** Resolving executable dependencies ***
Nov 25 18:07:53 np0005535692.novalocal dracut[1296]: *** Resolving executable dependencies done ***
Nov 25 18:07:53 np0005535692.novalocal dracut[1296]: *** Generating early-microcode cpio image ***
Nov 25 18:07:53 np0005535692.novalocal dracut[1296]: *** Store current command line parameters ***
Nov 25 18:07:53 np0005535692.novalocal dracut[1296]: Stored kernel commandline:
Nov 25 18:07:53 np0005535692.novalocal dracut[1296]: No dracut internal kernel commandline stored in the initramfs
Nov 25 18:07:53 np0005535692.novalocal dracut[1296]: *** Install squash loader ***
Nov 25 18:07:54 np0005535692.novalocal dracut[1296]: *** Squashing the files inside the initramfs ***
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: *** Squashing the files inside the initramfs done ***
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: *** Hardlinking files ***
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Mode:           real
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Files:          50
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Linked:         0 files
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Compared:       0 xattrs
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Compared:       0 files
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Saved:          0 B
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: Duration:       0.000370 seconds
Nov 25 18:07:55 np0005535692.novalocal dracut[1296]: *** Hardlinking files done ***
Nov 25 18:07:56 np0005535692.novalocal dracut[1296]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 25 18:07:56 np0005535692.novalocal kdumpctl[1021]: kdump: kexec: loaded kdump kernel
Nov 25 18:07:56 np0005535692.novalocal kdumpctl[1021]: kdump: Starting kdump: [OK]
Nov 25 18:07:56 np0005535692.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 25 18:07:56 np0005535692.novalocal systemd[1]: Startup finished in 1.634s (kernel) + 3.031s (initrd) + 19.967s (userspace) = 24.633s.
Nov 25 18:08:11 np0005535692.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 18:08:34 np0005535692.novalocal sshd-session[4301]: Accepted publickey for zuul from 38.102.83.114 port 46040 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 25 18:08:34 np0005535692.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 25 18:08:34 np0005535692.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 25 18:08:34 np0005535692.novalocal systemd-logind[820]: New session 1 of user zuul.
Nov 25 18:08:34 np0005535692.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 25 18:08:34 np0005535692.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Queued start job for default target Main User Target.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Created slice User Application Slice.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Reached target Paths.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Reached target Timers.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Starting D-Bus User Message Bus Socket...
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Starting Create User's Volatile Files and Directories...
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Listening on D-Bus User Message Bus Socket.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Reached target Sockets.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Finished Create User's Volatile Files and Directories.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Reached target Basic System.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Reached target Main User Target.
Nov 25 18:08:34 np0005535692.novalocal systemd[4305]: Startup finished in 151ms.
Nov 25 18:08:34 np0005535692.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 25 18:08:34 np0005535692.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 25 18:08:34 np0005535692.novalocal sshd-session[4301]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:08:35 np0005535692.novalocal python3[4387]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:08:38 np0005535692.novalocal python3[4415]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:08:50 np0005535692.novalocal python3[4473]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:08:51 np0005535692.novalocal python3[4513]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 25 18:08:53 np0005535692.novalocal python3[4539]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrTlIIJ/xsmz4EXUu/e0saT+NTLMrlpaklh/B6vtRjfX+OtOgoN1Fcrf/+Y0Xp4zI7rA1nDpJzJNIrybfuJVeX5HMI+OjBnVGItRjUMVngWCo8OPzj53zuUReFycGeUqMyTlC0PlaEO/wwDvAbTsiv0Mwi/gdOTOt84MEtSbo/3XJW6C4mpepIIn4IMXvMO36JqhsxzQjL0PSOr3vZvu+1ZeU6KE4PFARCr4/biEI0rOBhI6Zekl8Yp4woC0JTBJUE9AEiKjsUTAn0Rbxz0S7ivrldSPEsNLT42koaZqQTSULpqjxPXj6p+KvAfBVPMNV1BSEse/HDiQZJppJHdkW+KtJ6uNg7D8HEMk0j2X9JeL7spMg51nKTUgGXqrDaJDjk4vpFlTfIvlGiSm0J6/NCENIw2uKxy51Hfy3MxO45YmTs8jkImZcxgjsNPNoaiab8UoZIw0/FnXbbsGV57XT+S8I8K/xYsoxv0eoSDa4ypGZoFIaveQ2b3WgxofKSFsU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:08:54 np0005535692.novalocal python3[4563]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:08:54 np0005535692.novalocal python3[4662]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:08:54 np0005535692.novalocal python3[4733]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764094134.1810958-229-173741389849493/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=942c878f5ee64e30b7b132b1d3720a65_id_rsa follow=False checksum=88a90532aa0b09891a9a7e002ea152e520961bd8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:08:55 np0005535692.novalocal python3[4856]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:08:55 np0005535692.novalocal python3[4927]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764094135.2132502-273-240647702442812/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=942c878f5ee64e30b7b132b1d3720a65_id_rsa.pub follow=False checksum=5d0358b4244abf83c87b70e52d5aed49519ec656 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:08:57 np0005535692.novalocal python3[4975]: ansible-ping Invoked with data=pong
Nov 25 18:08:58 np0005535692.novalocal python3[4999]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:09:01 np0005535692.novalocal python3[5057]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 25 18:09:02 np0005535692.novalocal python3[5089]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:02 np0005535692.novalocal python3[5113]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:02 np0005535692.novalocal python3[5137]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:03 np0005535692.novalocal python3[5161]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:03 np0005535692.novalocal python3[5185]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:03 np0005535692.novalocal python3[5209]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:05 np0005535692.novalocal sudo[5233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczdfmgfioiylmodcpanltvewimohnlq ; /usr/bin/python3'
Nov 25 18:09:05 np0005535692.novalocal sudo[5233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:05 np0005535692.novalocal python3[5235]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:05 np0005535692.novalocal sudo[5233]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:06 np0005535692.novalocal sudo[5311]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygvolubdwvjflolverddwalhzgoohwlc ; /usr/bin/python3'
Nov 25 18:09:06 np0005535692.novalocal sudo[5311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:06 np0005535692.novalocal python3[5313]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:09:06 np0005535692.novalocal sudo[5311]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:06 np0005535692.novalocal sudo[5384]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klafhsszjdbgngzggyicxmbtgviexzih ; /usr/bin/python3'
Nov 25 18:09:06 np0005535692.novalocal sudo[5384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:06 np0005535692.novalocal python3[5386]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764094145.69597-26-56020186507339/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:06 np0005535692.novalocal sudo[5384]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:07 np0005535692.novalocal python3[5434]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:07 np0005535692.novalocal python3[5458]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:08 np0005535692.novalocal python3[5482]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:08 np0005535692.novalocal python3[5506]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:08 np0005535692.novalocal python3[5530]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:08 np0005535692.novalocal python3[5554]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:09 np0005535692.novalocal python3[5578]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:09 np0005535692.novalocal python3[5602]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:09 np0005535692.novalocal python3[5626]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:10 np0005535692.novalocal python3[5650]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:10 np0005535692.novalocal python3[5674]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:10 np0005535692.novalocal python3[5698]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:10 np0005535692.novalocal python3[5722]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:11 np0005535692.novalocal python3[5746]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:11 np0005535692.novalocal python3[5770]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:11 np0005535692.novalocal python3[5794]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:12 np0005535692.novalocal python3[5818]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:12 np0005535692.novalocal python3[5842]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:12 np0005535692.novalocal python3[5866]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:12 np0005535692.novalocal python3[5890]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:13 np0005535692.novalocal python3[5914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:13 np0005535692.novalocal python3[5938]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:13 np0005535692.novalocal python3[5962]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:14 np0005535692.novalocal python3[5986]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:14 np0005535692.novalocal python3[6010]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:14 np0005535692.novalocal python3[6034]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:09:16 np0005535692.novalocal sudo[6058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbqrtwahvrfayhusngpxvbfpluxiyjrw ; /usr/bin/python3'
Nov 25 18:09:16 np0005535692.novalocal sudo[6058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:17 np0005535692.novalocal python3[6060]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 25 18:09:17 np0005535692.novalocal systemd[1]: Starting Time & Date Service...
Nov 25 18:09:17 np0005535692.novalocal systemd[1]: Started Time & Date Service.
Nov 25 18:09:17 np0005535692.novalocal systemd-timedated[6062]: Changed time zone to 'UTC' (UTC).
Nov 25 18:09:17 np0005535692.novalocal sudo[6058]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:17 np0005535692.novalocal sudo[6089]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzefcjmoioosoijldqohentjlgezcvg ; /usr/bin/python3'
Nov 25 18:09:17 np0005535692.novalocal sudo[6089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:17 np0005535692.novalocal python3[6091]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:17 np0005535692.novalocal sudo[6089]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:18 np0005535692.novalocal python3[6167]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:09:18 np0005535692.novalocal python3[6238]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764094157.9357076-202-4241775071234/source _original_basename=tmpy1zat_zn follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:19 np0005535692.novalocal python3[6338]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:09:19 np0005535692.novalocal python3[6409]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764094158.74075-242-206628142796284/source _original_basename=tmpagr0s_k5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:20 np0005535692.novalocal sudo[6509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdpztowswuziykfazmvseyphahbrmdmp ; /usr/bin/python3'
Nov 25 18:09:20 np0005535692.novalocal sudo[6509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:20 np0005535692.novalocal python3[6511]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:09:20 np0005535692.novalocal sudo[6509]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:20 np0005535692.novalocal sudo[6582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwjzhfxpiwmeguofgbszdfotzsxgsbbv ; /usr/bin/python3'
Nov 25 18:09:20 np0005535692.novalocal sudo[6582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:20 np0005535692.novalocal python3[6584]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764094159.8974626-306-224726520239961/source _original_basename=tmp7m46m7th follow=False checksum=1bcc824686558cc83916b394196cc422cefa4598 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:20 np0005535692.novalocal sudo[6582]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:21 np0005535692.novalocal python3[6632]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:09:21 np0005535692.novalocal python3[6658]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:09:21 np0005535692.novalocal sudo[6736]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaqheadugijnbpsaaxrtgqbkqnjjvomy ; /usr/bin/python3'
Nov 25 18:09:21 np0005535692.novalocal sudo[6736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:22 np0005535692.novalocal python3[6738]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:09:22 np0005535692.novalocal sudo[6736]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:22 np0005535692.novalocal sudo[6809]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctnshtboykedqgcbxqruujfsikyjfrnq ; /usr/bin/python3'
Nov 25 18:09:22 np0005535692.novalocal sudo[6809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:22 np0005535692.novalocal python3[6811]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764094161.783427-362-221206388351338/source _original_basename=tmpw3hzycdy follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:22 np0005535692.novalocal sudo[6809]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:22 np0005535692.novalocal sudo[6860]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wynpogiennycbracpxwoaqmrusdtsxlm ; /usr/bin/python3'
Nov 25 18:09:22 np0005535692.novalocal sudo[6860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:23 np0005535692.novalocal python3[6862]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-5f15-6fbf-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:09:23 np0005535692.novalocal sudo[6860]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:23 np0005535692.novalocal python3[6890]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-5f15-6fbf-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 25 18:09:25 np0005535692.novalocal python3[6918]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:29 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 25 18:09:29 np0005535692.novalocal irqbalance[819]: IRQ 26 affinity is now unmanaged
Nov 25 18:09:44 np0005535692.novalocal sudo[6942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brrdglaoujdvgktzdbpdmjalvbagxlhx ; /usr/bin/python3'
Nov 25 18:09:44 np0005535692.novalocal sudo[6942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:09:44 np0005535692.novalocal python3[6944]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:09:44 np0005535692.novalocal sudo[6942]: pam_unix(sudo:session): session closed for user root
Nov 25 18:09:47 np0005535692.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 25 18:10:23 np0005535692.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 25 18:10:23 np0005535692.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4244] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 18:10:23 np0005535692.novalocal systemd-udevd[6947]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4545] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4583] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4588] device (eth1): carrier: link connected
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4592] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4601] policy: auto-activating connection 'Wired connection 1' (52d4efbe-1a37-380b-ae77-c8d713488d53)
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4607] device (eth1): Activation: starting connection 'Wired connection 1' (52d4efbe-1a37-380b-ae77-c8d713488d53)
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4608] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4612] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4617] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:10:23 np0005535692.novalocal NetworkManager[860]: <info>  [1764094223.4624] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:10:24 np0005535692.novalocal python3[6974]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-d183-9b28-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:10:30 np0005535692.novalocal sshd-session[6977]: Connection closed by authenticating user root 171.244.51.45 port 40098 [preauth]
Nov 25 18:10:31 np0005535692.novalocal sudo[7054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdxpzwcnkwnovjwrecccjrzmsgxtzaay ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 25 18:10:31 np0005535692.novalocal sudo[7054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:10:31 np0005535692.novalocal python3[7056]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:10:31 np0005535692.novalocal sudo[7054]: pam_unix(sudo:session): session closed for user root
Nov 25 18:10:31 np0005535692.novalocal sudo[7127]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfnmqcvrngygmmjpxblewwspvuciezlp ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 25 18:10:31 np0005535692.novalocal sudo[7127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:10:31 np0005535692.novalocal python3[7129]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764094231.045231-103-246533473973743/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=744284923d36d0b4baf6d0421a74a99b210eaba0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:10:31 np0005535692.novalocal sudo[7127]: pam_unix(sudo:session): session closed for user root
Nov 25 18:10:32 np0005535692.novalocal sudo[7177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmopuwzlfqhchzlfiqdqwcneynpmwcpw ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 25 18:10:32 np0005535692.novalocal sudo[7177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:10:32 np0005535692.novalocal python3[7179]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Stopping Network Manager...
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7049] caught SIGTERM, shutting down normally.
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7063] dhcp4 (eth0): canceled DHCP transaction
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7063] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7063] dhcp4 (eth0): state changed no lease
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7067] manager: NetworkManager state is now CONNECTING
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7195] dhcp4 (eth1): canceled DHCP transaction
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7196] dhcp4 (eth1): state changed no lease
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[860]: <info>  [1764094232.7506] exiting (success)
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Stopped Network Manager.
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: NetworkManager.service: Consumed 1.193s CPU time, 10.0M memory peak.
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Starting Network Manager...
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.8253] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:5314dbd0-f0d3-4d8c-818c-96beee19bec6)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.8255] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.8324] manager[0x5596acbde070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Starting Hostname Service...
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Started Hostname Service.
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9399] hostname: hostname: using hostnamed
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9400] hostname: static hostname changed from (none) to "np0005535692.novalocal"
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9407] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9414] manager[0x5596acbde070]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9415] manager[0x5596acbde070]: rfkill: WWAN hardware radio set enabled
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9459] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9460] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9461] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9462] manager: Networking is enabled by state file
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9465] settings: Loaded settings plugin: keyfile (internal)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9471] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9519] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9535] dhcp: init: Using DHCP client 'internal'
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9539] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9546] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9555] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9568] device (lo): Activation: starting connection 'lo' (7ebbea74-d1bb-4fb2-acd3-42edf212bfe7)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9579] device (eth0): carrier: link connected
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9586] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9593] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9594] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9605] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9615] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9624] device (eth1): carrier: link connected
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9631] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9637] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (52d4efbe-1a37-380b-ae77-c8d713488d53) (indicated)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9638] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9646] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9656] device (eth1): Activation: starting connection 'Wired connection 1' (52d4efbe-1a37-380b-ae77-c8d713488d53)
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Started Network Manager.
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9663] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9669] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9672] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9676] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9680] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9689] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9694] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9697] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9701] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9709] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9712] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9721] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9723] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9737] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9743] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9749] device (lo): Activation: successful, device activated.
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9758] dhcp4 (eth0): state changed new lease, address=38.102.83.177
Nov 25 18:10:32 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094232.9766] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 18:10:32 np0005535692.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 25 18:10:32 np0005535692.novalocal sudo[7177]: pam_unix(sudo:session): session closed for user root
Nov 25 18:10:33 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094233.0166] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 18:10:33 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094233.0187] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 18:10:33 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094233.0189] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 18:10:33 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094233.0193] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 18:10:33 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094233.0200] device (eth0): Activation: successful, device activated.
Nov 25 18:10:33 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094233.0207] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 18:10:33 np0005535692.novalocal python3[7263]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-d183-9b28-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:10:43 np0005535692.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 18:11:02 np0005535692.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.2797] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 18:11:18 np0005535692.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 18:11:18 np0005535692.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3097] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3101] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3115] device (eth1): Activation: successful, device activated.
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3126] manager: startup complete
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3129] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <warn>  [1764094278.3137] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3149] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3215] dhcp4 (eth1): canceled DHCP transaction
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3215] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3215] dhcp4 (eth1): state changed no lease
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3245] policy: auto-activating connection 'ci-private-network' (71bc06b1-39da-5e71-b6b2-29261e1233ba)
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3252] device (eth1): Activation: starting connection 'ci-private-network' (71bc06b1-39da-5e71-b6b2-29261e1233ba)
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3255] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3262] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3276] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3291] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3826] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3829] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:11:18 np0005535692.novalocal NetworkManager[7193]: <info>  [1764094278.3843] device (eth1): Activation: successful, device activated.
Nov 25 18:11:28 np0005535692.novalocal systemd[4305]: Starting Mark boot as successful...
Nov 25 18:11:28 np0005535692.novalocal systemd[4305]: Finished Mark boot as successful.
Nov 25 18:11:28 np0005535692.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 18:11:33 np0005535692.novalocal sshd-session[4314]: Received disconnect from 38.102.83.114 port 46040:11: disconnected by user
Nov 25 18:11:33 np0005535692.novalocal sshd-session[4314]: Disconnected from user zuul 38.102.83.114 port 46040
Nov 25 18:11:33 np0005535692.novalocal sshd-session[4301]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:11:33 np0005535692.novalocal systemd-logind[820]: Session 1 logged out. Waiting for processes to exit.
Nov 25 18:11:56 np0005535692.novalocal sshd-session[7293]: Accepted publickey for zuul from 38.102.83.114 port 34792 ssh2: RSA SHA256:jYBcT1icRquFQigknn/K3KSao1vKrqzJ1yq0uAHq9V0
Nov 25 18:11:56 np0005535692.novalocal systemd-logind[820]: New session 3 of user zuul.
Nov 25 18:11:56 np0005535692.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 25 18:11:56 np0005535692.novalocal sshd-session[7293]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:11:56 np0005535692.novalocal sudo[7372]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsyiytevknmcsgfjgetdnscledxsvyfy ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 25 18:11:56 np0005535692.novalocal sudo[7372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:11:56 np0005535692.novalocal python3[7374]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:11:56 np0005535692.novalocal sudo[7372]: pam_unix(sudo:session): session closed for user root
Nov 25 18:11:57 np0005535692.novalocal sudo[7445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqmfngpqmmulsljiindbmqbyqeycmwxc ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 25 18:11:57 np0005535692.novalocal sudo[7445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:11:57 np0005535692.novalocal python3[7447]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764094316.5099423-312-230372240619854/source _original_basename=tmpa3fx4bkx follow=False checksum=43232ddd569091af186347d270c891fd4eedea6d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:11:57 np0005535692.novalocal sudo[7445]: pam_unix(sudo:session): session closed for user root
Nov 25 18:12:00 np0005535692.novalocal sshd-session[7296]: Connection closed by 38.102.83.114 port 34792
Nov 25 18:12:00 np0005535692.novalocal sshd-session[7293]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:12:00 np0005535692.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 25 18:12:00 np0005535692.novalocal systemd-logind[820]: Session 3 logged out. Waiting for processes to exit.
Nov 25 18:12:00 np0005535692.novalocal systemd-logind[820]: Removed session 3.
Nov 25 18:14:28 np0005535692.novalocal systemd[4305]: Created slice User Background Tasks Slice.
Nov 25 18:14:28 np0005535692.novalocal systemd[4305]: Starting Cleanup of User's Temporary Files and Directories...
Nov 25 18:14:28 np0005535692.novalocal systemd[4305]: Finished Cleanup of User's Temporary Files and Directories.
Nov 25 18:15:19 np0005535692.novalocal sshd-session[7478]: Connection closed by authenticating user root 171.244.51.45 port 55664 [preauth]
Nov 25 18:16:50 np0005535692.novalocal sshd-session[7481]: Accepted publickey for zuul from 38.102.83.114 port 45470 ssh2: RSA SHA256:jYBcT1icRquFQigknn/K3KSao1vKrqzJ1yq0uAHq9V0
Nov 25 18:16:50 np0005535692.novalocal systemd-logind[820]: New session 4 of user zuul.
Nov 25 18:16:50 np0005535692.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 25 18:16:50 np0005535692.novalocal sshd-session[7481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:16:50 np0005535692.novalocal sudo[7508]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svxjeotulzqmvhmnwkwhegeloqrvvird ; /usr/bin/python3'
Nov 25 18:16:50 np0005535692.novalocal sudo[7508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:50 np0005535692.novalocal python3[7510]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-e371-1977-000000001ccd-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:16:50 np0005535692.novalocal sudo[7508]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:51 np0005535692.novalocal sudo[7537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqqizzjudakjvsvjybqvthlsvrxetfdp ; /usr/bin/python3'
Nov 25 18:16:51 np0005535692.novalocal sudo[7537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:51 np0005535692.novalocal python3[7539]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:16:51 np0005535692.novalocal sudo[7537]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:51 np0005535692.novalocal sudo[7563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abgyzojncgfmlreydiixnnzrrommjksf ; /usr/bin/python3'
Nov 25 18:16:51 np0005535692.novalocal sudo[7563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:51 np0005535692.novalocal python3[7565]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:16:51 np0005535692.novalocal sudo[7563]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:51 np0005535692.novalocal sudo[7589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tohkakabdgqwpyluypkphivdbtdrpblc ; /usr/bin/python3'
Nov 25 18:16:51 np0005535692.novalocal sudo[7589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:51 np0005535692.novalocal python3[7591]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:16:51 np0005535692.novalocal sudo[7589]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:52 np0005535692.novalocal sudo[7615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjnrnvdxbitfntzyrwnlcxrybxxhkhcm ; /usr/bin/python3'
Nov 25 18:16:52 np0005535692.novalocal sudo[7615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:52 np0005535692.novalocal python3[7617]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:16:52 np0005535692.novalocal sudo[7615]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:52 np0005535692.novalocal sudo[7641]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkbknrctbkiqwtihdoapfrbcfoxajqjc ; /usr/bin/python3'
Nov 25 18:16:52 np0005535692.novalocal sudo[7641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:52 np0005535692.novalocal python3[7643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:16:52 np0005535692.novalocal sudo[7641]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:53 np0005535692.novalocal sudo[7719]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqcxxdtzuercddocbtlaavgspnebqyxe ; /usr/bin/python3'
Nov 25 18:16:53 np0005535692.novalocal sudo[7719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:53 np0005535692.novalocal python3[7721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:16:53 np0005535692.novalocal sudo[7719]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:53 np0005535692.novalocal sudo[7792]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khvnfupmfhakmsbkrxlpeteqsecefgwq ; /usr/bin/python3'
Nov 25 18:16:53 np0005535692.novalocal sudo[7792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:53 np0005535692.novalocal python3[7794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764094612.9851599-488-250519178354636/source _original_basename=tmpyxrdi8b2 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:16:53 np0005535692.novalocal sudo[7792]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:54 np0005535692.novalocal sudo[7842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uufbnimvapysqvbbqwcffdbsghhfoyvs ; /usr/bin/python3'
Nov 25 18:16:54 np0005535692.novalocal sudo[7842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:54 np0005535692.novalocal python3[7844]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:16:54 np0005535692.novalocal systemd[1]: Reloading.
Nov 25 18:16:54 np0005535692.novalocal systemd-rc-local-generator[7867]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:16:54 np0005535692.novalocal sudo[7842]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:55 np0005535692.novalocal sudo[7898]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mldfpioptihracbwmuvptnuyrtwwwfdd ; /usr/bin/python3'
Nov 25 18:16:56 np0005535692.novalocal sudo[7898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:56 np0005535692.novalocal python3[7900]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 25 18:16:56 np0005535692.novalocal sudo[7898]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:56 np0005535692.novalocal sudo[7924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedbduwwiierqaktvwvwkhnpjstwcgxc ; /usr/bin/python3'
Nov 25 18:16:56 np0005535692.novalocal sudo[7924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:56 np0005535692.novalocal python3[7926]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:16:56 np0005535692.novalocal sudo[7924]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:56 np0005535692.novalocal sudo[7952]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rccgskethvlrevkpfsrtngbkhqohricc ; /usr/bin/python3'
Nov 25 18:16:56 np0005535692.novalocal sudo[7952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:56 np0005535692.novalocal python3[7954]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:16:56 np0005535692.novalocal sudo[7952]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:56 np0005535692.novalocal sudo[7980]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjzaiuzmktuzwtfbesfemvfwtpdsojne ; /usr/bin/python3'
Nov 25 18:16:56 np0005535692.novalocal sudo[7980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:57 np0005535692.novalocal python3[7982]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:16:57 np0005535692.novalocal sudo[7980]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:57 np0005535692.novalocal sudo[8008]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvuuskrcrjihjimrnrpuzlyigyftrfs ; /usr/bin/python3'
Nov 25 18:16:57 np0005535692.novalocal sudo[8008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:16:57 np0005535692.novalocal python3[8010]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:16:57 np0005535692.novalocal sudo[8008]: pam_unix(sudo:session): session closed for user root
Nov 25 18:16:58 np0005535692.novalocal python3[8037]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-e371-1977-000000001cd4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:16:58 np0005535692.novalocal python3[8067]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 25 18:17:00 np0005535692.novalocal sshd-session[7484]: Connection closed by 38.102.83.114 port 45470
Nov 25 18:17:00 np0005535692.novalocal sshd-session[7481]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:17:00 np0005535692.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 25 18:17:00 np0005535692.novalocal systemd[1]: session-4.scope: Consumed 4.787s CPU time.
Nov 25 18:17:00 np0005535692.novalocal systemd-logind[820]: Session 4 logged out. Waiting for processes to exit.
Nov 25 18:17:00 np0005535692.novalocal systemd-logind[820]: Removed session 4.
Nov 25 18:17:02 np0005535692.novalocal sshd-session[8074]: Accepted publickey for zuul from 38.102.83.114 port 46104 ssh2: RSA SHA256:jYBcT1icRquFQigknn/K3KSao1vKrqzJ1yq0uAHq9V0
Nov 25 18:17:02 np0005535692.novalocal systemd-logind[820]: New session 5 of user zuul.
Nov 25 18:17:02 np0005535692.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 25 18:17:02 np0005535692.novalocal sshd-session[8074]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:17:02 np0005535692.novalocal sudo[8101]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eltrbfrnrmybhuhjqfgqdgdrbmxzrlxf ; /usr/bin/python3'
Nov 25 18:17:02 np0005535692.novalocal sudo[8101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:17:03 np0005535692.novalocal python3[8103]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:17:20 np0005535692.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:17:30 np0005535692.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:17:41 np0005535692.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:17:42 np0005535692.novalocal setsebool[8171]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 25 18:17:42 np0005535692.novalocal setsebool[8171]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:17:53 np0005535692.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:18:13 np0005535692.novalocal dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 25 18:18:13 np0005535692.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:18:13 np0005535692.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:18:13 np0005535692.novalocal systemd[1]: Reloading.
Nov 25 18:18:13 np0005535692.novalocal systemd-rc-local-generator[8926]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:18:14 np0005535692.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:18:15 np0005535692.novalocal sudo[8101]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:18 np0005535692.novalocal python3[12012]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-5bff-71a9-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:18:19 np0005535692.novalocal irqbalance[819]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 25 18:18:19 np0005535692.novalocal irqbalance[819]: IRQ 27 affinity is now unmanaged
Nov 25 18:18:19 np0005535692.novalocal kernel: evm: overlay not supported
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: Starting D-Bus User Message Bus...
Nov 25 18:18:19 np0005535692.novalocal dbus-broker-launch[12676]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 25 18:18:19 np0005535692.novalocal dbus-broker-launch[12676]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: Started D-Bus User Message Bus.
Nov 25 18:18:19 np0005535692.novalocal dbus-broker-lau[12676]: Ready
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: Created slice Slice /user.
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: podman-12556.scope: unit configures an IP firewall, but not running as root.
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: (This warning is only shown for the first unit using IP firewalling.)
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: Started podman-12556.scope.
Nov 25 18:18:19 np0005535692.novalocal systemd[4305]: Started podman-pause-9a598e09.scope.
Nov 25 18:18:20 np0005535692.novalocal sudo[13109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqzqurekimtulpnjqexytcipyswuqvn ; /usr/bin/python3'
Nov 25 18:18:20 np0005535692.novalocal sudo[13109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:20 np0005535692.novalocal python3[13134]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.27:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.27:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:18:20 np0005535692.novalocal python3[13134]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 25 18:18:20 np0005535692.novalocal sudo[13109]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:20 np0005535692.novalocal sshd-session[8077]: Connection closed by 38.102.83.114 port 46104
Nov 25 18:18:20 np0005535692.novalocal sshd-session[8074]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:18:20 np0005535692.novalocal systemd-logind[820]: Session 5 logged out. Waiting for processes to exit.
Nov 25 18:18:20 np0005535692.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 25 18:18:20 np0005535692.novalocal systemd[1]: session-5.scope: Consumed 1min 3.970s CPU time.
Nov 25 18:18:20 np0005535692.novalocal systemd-logind[820]: Removed session 5.
Nov 25 18:18:38 np0005535692.novalocal sshd-session[19273]: Unable to negotiate with 38.102.83.130 port 59560: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 25 18:18:38 np0005535692.novalocal sshd-session[19277]: Connection closed by 38.102.83.130 port 59550 [preauth]
Nov 25 18:18:38 np0005535692.novalocal sshd-session[19274]: Unable to negotiate with 38.102.83.130 port 59578: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 25 18:18:38 np0005535692.novalocal sshd-session[19280]: Connection closed by 38.102.83.130 port 59546 [preauth]
Nov 25 18:18:38 np0005535692.novalocal sshd-session[19276]: Unable to negotiate with 38.102.83.130 port 59576: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 25 18:18:43 np0005535692.novalocal sshd-session[20766]: Accepted publickey for zuul from 38.102.83.114 port 56038 ssh2: RSA SHA256:jYBcT1icRquFQigknn/K3KSao1vKrqzJ1yq0uAHq9V0
Nov 25 18:18:43 np0005535692.novalocal systemd-logind[820]: New session 6 of user zuul.
Nov 25 18:18:43 np0005535692.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 25 18:18:43 np0005535692.novalocal sshd-session[20766]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:18:43 np0005535692.novalocal python3[20872]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA7MasSjBhRML5WqL47zAfpNxnhUPEiVXKKvh40KEsXaIoXRnon/W1HKZ+kkhbklR4Qq1oIDd5gdyRfpyWTOyzY= zuul@np0005535691.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:18:44 np0005535692.novalocal sudo[21024]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgddwitlrsumzyqcyzbcmcwmriwdiyko ; /usr/bin/python3'
Nov 25 18:18:44 np0005535692.novalocal sudo[21024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:44 np0005535692.novalocal python3[21035]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA7MasSjBhRML5WqL47zAfpNxnhUPEiVXKKvh40KEsXaIoXRnon/W1HKZ+kkhbklR4Qq1oIDd5gdyRfpyWTOyzY= zuul@np0005535691.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:18:44 np0005535692.novalocal sudo[21024]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:45 np0005535692.novalocal sudo[21345]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-polkgmnycgjhfcygkcopndwrabxbnyys ; /usr/bin/python3'
Nov 25 18:18:45 np0005535692.novalocal sudo[21345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:45 np0005535692.novalocal python3[21357]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005535692.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 25 18:18:45 np0005535692.novalocal useradd[21427]: new group: name=cloud-admin, GID=1002
Nov 25 18:18:45 np0005535692.novalocal useradd[21427]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 25 18:18:45 np0005535692.novalocal sudo[21345]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:45 np0005535692.novalocal sudo[21559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrwfstrifphvpkqkhvawdilvxavbaunb ; /usr/bin/python3'
Nov 25 18:18:45 np0005535692.novalocal sudo[21559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:45 np0005535692.novalocal python3[21570]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA7MasSjBhRML5WqL47zAfpNxnhUPEiVXKKvh40KEsXaIoXRnon/W1HKZ+kkhbklR4Qq1oIDd5gdyRfpyWTOyzY= zuul@np0005535691.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 25 18:18:45 np0005535692.novalocal sudo[21559]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:46 np0005535692.novalocal sudo[21815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dafbjhqepkwfpmoprhxvmqwcbqzhelly ; /usr/bin/python3'
Nov 25 18:18:46 np0005535692.novalocal sudo[21815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:46 np0005535692.novalocal python3[21825]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:18:46 np0005535692.novalocal sudo[21815]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:46 np0005535692.novalocal sudo[22057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skhszthgfsusjbzloxfcrdlnrktuogwd ; /usr/bin/python3'
Nov 25 18:18:46 np0005535692.novalocal sudo[22057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:46 np0005535692.novalocal python3[22064]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764094726.0074992-151-267702150937275/source _original_basename=tmpl4s22_4d follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:18:46 np0005535692.novalocal sudo[22057]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:47 np0005535692.novalocal sudo[22368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajtrlufzopiabushmylxgbojofuzyphh ; /usr/bin/python3'
Nov 25 18:18:47 np0005535692.novalocal sudo[22368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:18:47 np0005535692.novalocal python3[22379]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 25 18:18:47 np0005535692.novalocal systemd[1]: Starting Hostname Service...
Nov 25 18:18:47 np0005535692.novalocal systemd[1]: Started Hostname Service.
Nov 25 18:18:47 np0005535692.novalocal systemd-hostnamed[22472]: Changed pretty hostname to 'compute-0'
Nov 25 18:18:47 compute-0 systemd-hostnamed[22472]: Hostname set to <compute-0> (static)
Nov 25 18:18:47 compute-0 NetworkManager[7193]: <info>  [1764094727.9775] hostname: static hostname changed from "np0005535692.novalocal" to "compute-0"
Nov 25 18:18:48 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 18:18:48 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 18:18:48 compute-0 sudo[22368]: pam_unix(sudo:session): session closed for user root
Nov 25 18:18:48 compute-0 sshd-session[20814]: Connection closed by 38.102.83.114 port 56038
Nov 25 18:18:48 compute-0 sshd-session[20766]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:18:48 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 25 18:18:48 compute-0 systemd[1]: session-6.scope: Consumed 2.736s CPU time.
Nov 25 18:18:48 compute-0 systemd-logind[820]: Session 6 logged out. Waiting for processes to exit.
Nov 25 18:18:48 compute-0 systemd-logind[820]: Removed session 6.
Nov 25 18:18:58 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 18:19:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:19:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:19:13 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 10.255s CPU time.
Nov 25 18:19:13 compute-0 systemd[1]: run-r741ad780d1284ebeb4653cf53e585ba8.service: Deactivated successfully.
Nov 25 18:19:18 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 18:19:54 compute-0 sshd-session[29927]: Connection closed by authenticating user root 171.244.51.45 port 46496 [preauth]
Nov 25 18:22:46 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 25 18:22:46 compute-0 sshd-session[29933]: Accepted publickey for zuul from 38.102.83.130 port 43176 ssh2: RSA SHA256:jYBcT1icRquFQigknn/K3KSao1vKrqzJ1yq0uAHq9V0
Nov 25 18:22:46 compute-0 systemd-logind[820]: New session 7 of user zuul.
Nov 25 18:22:46 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 25 18:22:46 compute-0 sshd-session[29933]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:22:46 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 25 18:22:46 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 25 18:22:46 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 25 18:22:46 compute-0 python3[30011]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:22:48 compute-0 sudo[30125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onkdpkeabiuohnnpzorqhwnijrrukozs ; /usr/bin/python3'
Nov 25 18:22:48 compute-0 sudo[30125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:48 compute-0 python3[30127]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:48 compute-0 sudo[30125]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:49 compute-0 sudo[30198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeoptywrrkuixdcipqwhtqifhczkuhbf ; /usr/bin/python3'
Nov 25 18:22:49 compute-0 sudo[30198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:49 compute-0 python3[30200]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=delorean.repo follow=False checksum=eecfec7997156c379dd1b108228e60ea22f5b806 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:49 compute-0 sudo[30198]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:50 compute-0 sudo[30224]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awosdsmvngsyekoyovqzjuhbflixzanv ; /usr/bin/python3'
Nov 25 18:22:50 compute-0 sudo[30224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:50 compute-0 python3[30226]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:50 compute-0 sudo[30224]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:50 compute-0 sudo[30297]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cavwmbzfuzsycxwdvbdroabaomjuzaaz ; /usr/bin/python3'
Nov 25 18:22:50 compute-0 sudo[30297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:50 compute-0 python3[30299]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:50 compute-0 sudo[30297]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:51 compute-0 sudo[30323]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnqypffgypddolfmmzysvzqtyyfdzdel ; /usr/bin/python3'
Nov 25 18:22:51 compute-0 sudo[30323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:51 compute-0 python3[30325]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:51 compute-0 sudo[30323]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:51 compute-0 sudo[30396]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofxixgbqyzvjmcdmwpwqagvoqpvidysk ; /usr/bin/python3'
Nov 25 18:22:51 compute-0 sudo[30396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:51 compute-0 python3[30398]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:51 compute-0 sudo[30396]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:51 compute-0 sudo[30422]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssnxmwvisdafexfcymtzvavxinbxruel ; /usr/bin/python3'
Nov 25 18:22:51 compute-0 sudo[30422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:51 compute-0 python3[30424]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:51 compute-0 sudo[30422]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:52 compute-0 sudo[30495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmjmfqkfzhlvpqcrembyyelesbdcrjnw ; /usr/bin/python3'
Nov 25 18:22:52 compute-0 sudo[30495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:52 compute-0 python3[30497]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:52 compute-0 sudo[30495]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:52 compute-0 sudo[30521]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfvwvpcqtaaxqbjcnmajwohhpmyvyilx ; /usr/bin/python3'
Nov 25 18:22:52 compute-0 sudo[30521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:52 compute-0 python3[30523]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:52 compute-0 sudo[30521]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:53 compute-0 sudo[30594]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pijnmstvcpbbwqmasonwfubhtvalyysc ; /usr/bin/python3'
Nov 25 18:22:53 compute-0 sudo[30594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:53 compute-0 python3[30596]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:53 compute-0 sudo[30594]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:53 compute-0 sudo[30620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-measobzvreiixmiixlmsgrdqatlygpji ; /usr/bin/python3'
Nov 25 18:22:53 compute-0 sudo[30620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:53 compute-0 python3[30622]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:53 compute-0 sudo[30620]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:53 compute-0 sudo[30693]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsoxwzdvlfjerikbztftsfpuzszikvcl ; /usr/bin/python3'
Nov 25 18:22:53 compute-0 sudo[30693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:53 compute-0 python3[30695]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:53 compute-0 sudo[30693]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:54 compute-0 sudo[30719]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmusezqaquqerjplafeowzvwxaazepms ; /usr/bin/python3'
Nov 25 18:22:54 compute-0 sudo[30719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:54 compute-0 python3[30721]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 18:22:54 compute-0 sudo[30719]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:54 compute-0 sudo[30792]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqpqiwqasatowaygldhleiqyvmrgrvul ; /usr/bin/python3'
Nov 25 18:22:54 compute-0 sudo[30792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:22:54 compute-0 python3[30794]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764094968.5309587-33766-271488645970517/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=0be7eb3bc4775787fd2a5a7ac7bcd314c8e050fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:22:54 compute-0 sudo[30792]: pam_unix(sudo:session): session closed for user root
Nov 25 18:22:57 compute-0 sshd-session[30820]: Unable to negotiate with 192.168.122.11 port 35764: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 25 18:22:57 compute-0 sshd-session[30819]: Connection closed by 192.168.122.11 port 35728 [preauth]
Nov 25 18:22:57 compute-0 sshd-session[30822]: Connection closed by 192.168.122.11 port 35736 [preauth]
Nov 25 18:22:57 compute-0 sshd-session[30821]: Unable to negotiate with 192.168.122.11 port 35740: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 25 18:22:57 compute-0 sshd-session[30823]: Unable to negotiate with 192.168.122.11 port 35750: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 25 18:24:12 compute-0 python3[30853]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:29:11 compute-0 sshd-session[29938]: Received disconnect from 38.102.83.130 port 43176:11: disconnected by user
Nov 25 18:29:11 compute-0 sshd-session[29938]: Disconnected from user zuul 38.102.83.130 port 43176
Nov 25 18:29:11 compute-0 sshd-session[29933]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:29:11 compute-0 systemd[1]: Starting dnf makecache...
Nov 25 18:29:11 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 25 18:29:11 compute-0 systemd[1]: session-7.scope: Consumed 6.297s CPU time.
Nov 25 18:29:11 compute-0 systemd-logind[820]: Session 7 logged out. Waiting for processes to exit.
Nov 25 18:29:11 compute-0 systemd-logind[820]: Removed session 7.
Nov 25 18:29:11 compute-0 dnf[30857]: Failed determining last makecache time.
Nov 25 18:29:11 compute-0 dnf[30857]: delorean-python-castellan-609f4ea667df386849930 275 kB/s |  13 kB     00:00
Nov 25 18:29:11 compute-0 dnf[30857]: delorean-openstack-ironic-c525a16b06266b6b474c9 2.9 MB/s |  64 kB     00:00
Nov 25 18:29:11 compute-0 dnf[30857]: delorean-openstack-cinder-92c645f1f1e913b5b1cd8 1.2 MB/s |  30 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-ansible-collections-openstack-f584c54d 5.4 MB/s | 121 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-ceilometer-60803e710e7f5b3cd 934 kB/s |  24 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-kolla-e7bd46dad0b62ff151667b 9.4 MB/s | 274 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-nova-3e7017eb2952d5258d96e27 1.2 MB/s |  37 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-designate-82652559ea8641b11c 757 kB/s |  19 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-glance-e055873be4079bc9d3716 626 kB/s |  19 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-keystone-4f1b7e96e38463d5fcd 661 kB/s |  23 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-manila-70623bb84e7880f7f2f75 1.0 MB/s |  27 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-python-networking-mlnx-7139a7f0bce9d6a 4.0 MB/s | 130 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-octavia-e981d3e172b8e4471f97 930 kB/s |  25 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-watcher-71470dac73abba9e5dcf 698 kB/s |  17 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-python-tcib-c2ae956ec1898faaed6197ef95 420 kB/s | 7.9 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-puppet-magnum-ec92e647ad5e77720f01cce0 5.7 MB/s | 155 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-openstack-swift-e10c2bafcb8fc80929bce3 511 kB/s |  15 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-python-mistral-tests-tempest-900580c95 1.4 MB/s |  35 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: delorean-python-django-horizon-915b939b342dc65f 4.4 MB/s | 105 kB     00:00
Nov 25 18:29:12 compute-0 dnf[30857]: CentOS Stream 9 - BaseOS                         72 kB/s | 6.7 kB     00:00
Nov 25 18:29:13 compute-0 dnf[30857]: CentOS Stream 9 - AppStream                      42 kB/s | 6.8 kB     00:00
Nov 25 18:29:13 compute-0 dnf[30857]: CentOS Stream 9 - CRB                            68 kB/s | 6.5 kB     00:00
Nov 25 18:29:13 compute-0 dnf[30857]: CentOS Stream 9 - Extras packages                70 kB/s | 8.3 kB     00:00
Nov 25 18:29:13 compute-0 dnf[30857]: dlrn-master-testing                              43 MB/s | 2.4 MB     00:00
Nov 25 18:29:14 compute-0 dnf[30857]: dlrn-master-build-deps                           22 MB/s | 516 kB     00:00
Nov 25 18:29:14 compute-0 dnf[30857]: centos9-rabbitmq                                8.3 MB/s | 123 kB     00:00
Nov 25 18:29:14 compute-0 dnf[30857]: centos9-storage                                  24 MB/s | 415 kB     00:00
Nov 25 18:29:14 compute-0 dnf[30857]: centos9-opstools                                4.7 MB/s |  51 kB     00:00
Nov 25 18:29:14 compute-0 dnf[30857]: NFV SIG OpenvSwitch                              25 MB/s | 454 kB     00:00
Nov 25 18:29:15 compute-0 dnf[30857]: repo-setup-centos-appstream                     101 MB/s |  25 MB     00:00
Nov 25 18:29:20 compute-0 dnf[30857]: repo-setup-centos-baseos                         91 MB/s | 8.8 MB     00:00
Nov 25 18:29:22 compute-0 dnf[30857]: repo-setup-centos-highavailability               30 MB/s | 744 kB     00:00
Nov 25 18:29:22 compute-0 dnf[30857]: repo-setup-centos-powertools                     92 MB/s | 7.3 MB     00:00
Nov 25 18:29:25 compute-0 dnf[30857]: Extra Packages for Enterprise Linux 9 - x86_64   15 MB/s |  20 MB     00:01
Nov 25 18:29:36 compute-0 dnf[30857]: Metadata cache created.
Nov 25 18:29:36 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 25 18:29:36 compute-0 systemd[1]: Finished dnf makecache.
Nov 25 18:29:36 compute-0 systemd[1]: dnf-makecache.service: Consumed 22.940s CPU time.
Nov 25 18:36:42 compute-0 sshd-session[30963]: Accepted publickey for zuul from 192.168.122.30 port 41682 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:36:42 compute-0 systemd-logind[820]: New session 8 of user zuul.
Nov 25 18:36:42 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 25 18:36:42 compute-0 sshd-session[30963]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:36:43 compute-0 python3.9[31116]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:36:45 compute-0 sudo[31295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eefttwaybqkdanozweytfllnlvrdnjci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095804.5879145-44-37306961804339/AnsiballZ_command.py'
Nov 25 18:36:45 compute-0 sudo[31295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:36:45 compute-0 python3.9[31297]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:36:53 compute-0 sudo[31295]: pam_unix(sudo:session): session closed for user root
Nov 25 18:36:53 compute-0 sshd-session[30966]: Connection closed by 192.168.122.30 port 41682
Nov 25 18:36:53 compute-0 sshd-session[30963]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:36:53 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 25 18:36:53 compute-0 systemd[1]: session-8.scope: Consumed 8.743s CPU time.
Nov 25 18:36:53 compute-0 systemd-logind[820]: Session 8 logged out. Waiting for processes to exit.
Nov 25 18:36:53 compute-0 systemd-logind[820]: Removed session 8.
Nov 25 18:36:58 compute-0 sshd-session[31354]: Accepted publickey for zuul from 192.168.122.30 port 48434 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:36:58 compute-0 systemd-logind[820]: New session 9 of user zuul.
Nov 25 18:36:58 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 25 18:36:58 compute-0 sshd-session[31354]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:36:59 compute-0 python3.9[31507]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:36:59 compute-0 sshd-session[31357]: Connection closed by 192.168.122.30 port 48434
Nov 25 18:36:59 compute-0 sshd-session[31354]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:36:59 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 25 18:36:59 compute-0 systemd-logind[820]: Session 9 logged out. Waiting for processes to exit.
Nov 25 18:36:59 compute-0 systemd-logind[820]: Removed session 9.
Nov 25 18:37:15 compute-0 sshd-session[31535]: Accepted publickey for zuul from 192.168.122.30 port 35140 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:37:15 compute-0 systemd-logind[820]: New session 10 of user zuul.
Nov 25 18:37:15 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 25 18:37:15 compute-0 sshd-session[31535]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:37:16 compute-0 python3.9[31688]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 25 18:37:18 compute-0 python3.9[31862]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:37:18 compute-0 sudo[32012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odhlkqprzxbqcxcfbatbwrxhoffpffvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095838.4288197-69-94059660337630/AnsiballZ_command.py'
Nov 25 18:37:18 compute-0 sudo[32012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:19 compute-0 python3.9[32014]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:37:19 compute-0 sudo[32012]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:20 compute-0 sudo[32165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-figbhupxqoldgpevykxqklqwuqhgggnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095839.6244178-93-48444278480046/AnsiballZ_stat.py'
Nov 25 18:37:20 compute-0 sudo[32165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:20 compute-0 python3.9[32167]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:37:20 compute-0 sudo[32165]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:21 compute-0 sudo[32317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kffbgzxbpicpfmhemxrazpmuvhonyzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095840.5190933-109-40404441950756/AnsiballZ_file.py'
Nov 25 18:37:21 compute-0 sudo[32317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:21 compute-0 python3.9[32319]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:37:21 compute-0 sudo[32317]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:21 compute-0 sudo[32469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slvpvaumviegtqclyeffsbiwtkmieiyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095841.488086-125-182257188679309/AnsiballZ_stat.py'
Nov 25 18:37:21 compute-0 sudo[32469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:22 compute-0 python3.9[32471]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:37:22 compute-0 sudo[32469]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:22 compute-0 sudo[32592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omzkgypjanohqgysgphvumjgalxiudog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095841.488086-125-182257188679309/AnsiballZ_copy.py'
Nov 25 18:37:22 compute-0 sudo[32592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:22 compute-0 python3.9[32594]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764095841.488086-125-182257188679309/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:37:22 compute-0 sudo[32592]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:23 compute-0 sudo[32744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzzhluicjezxfyttfoubxdbcvhoavxff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095843.154579-155-120368440577772/AnsiballZ_setup.py'
Nov 25 18:37:23 compute-0 sudo[32744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:23 compute-0 python3.9[32746]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:37:24 compute-0 sudo[32744]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:24 compute-0 sudo[32900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxgkfycnhpbcorvisjlutxvfbmzlajpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095844.3430538-171-196964165946427/AnsiballZ_file.py'
Nov 25 18:37:24 compute-0 sudo[32900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:24 compute-0 python3.9[32902]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:37:24 compute-0 sudo[32900]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:25 compute-0 sudo[33052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhffhxyqwzqasxynvisrorvpvqnjvtio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095845.2076387-189-186012943805525/AnsiballZ_file.py'
Nov 25 18:37:25 compute-0 sudo[33052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:25 compute-0 python3.9[33054]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:37:25 compute-0 sudo[33052]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:26 compute-0 python3.9[33204]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:37:32 compute-0 python3.9[33457]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:37:33 compute-0 python3.9[33607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:37:34 compute-0 python3.9[33761]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:37:35 compute-0 sudo[33917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqlaxjupkksfwvcpvvcjlmnjujngapdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095855.1077237-285-79390901049320/AnsiballZ_setup.py'
Nov 25 18:37:35 compute-0 sudo[33917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:35 compute-0 python3.9[33919]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:37:36 compute-0 sudo[33917]: pam_unix(sudo:session): session closed for user root
Nov 25 18:37:36 compute-0 sudo[34001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amwgtndcpvtjewxgtmbanoacuplpgdfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095855.1077237-285-79390901049320/AnsiballZ_dnf.py'
Nov 25 18:37:36 compute-0 sudo[34001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:37:36 compute-0 python3.9[34003]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:38:17 compute-0 systemd[1]: Reloading.
Nov 25 18:38:18 compute-0 systemd-rc-local-generator[34197]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:38:18 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 25 18:38:18 compute-0 systemd[1]: Reloading.
Nov 25 18:38:18 compute-0 systemd-rc-local-generator[34235]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:38:18 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 25 18:38:18 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 25 18:38:18 compute-0 systemd[1]: Reloading.
Nov 25 18:38:18 compute-0 systemd-rc-local-generator[34279]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:38:18 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 25 18:38:19 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:38:19 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:38:19 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:39:23 compute-0 kernel: SELinux:  Converting 2719 SID table entries...
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:39:23 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:39:23 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 25 18:39:23 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:39:23 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:39:23 compute-0 systemd[1]: Reloading.
Nov 25 18:39:23 compute-0 systemd-rc-local-generator[34599]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:39:23 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:39:24 compute-0 sudo[34001]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:25 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:39:25 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:39:25 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.703s CPU time.
Nov 25 18:39:25 compute-0 systemd[1]: run-r3a885e02f06e41c7a39798b83147d2f2.service: Deactivated successfully.
Nov 25 18:39:25 compute-0 sudo[35508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwdfahiytbtdvpijeuwbdhbwyihhvurt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095965.4709048-309-33475162219528/AnsiballZ_command.py'
Nov 25 18:39:25 compute-0 sudo[35508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:26 compute-0 python3.9[35510]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:39:27 compute-0 sudo[35508]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:28 compute-0 sudo[35789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdxloqctebolawnbxiabbozxazstpufq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095967.5295699-325-178739914787586/AnsiballZ_selinux.py'
Nov 25 18:39:28 compute-0 sudo[35789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:28 compute-0 python3.9[35791]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 25 18:39:28 compute-0 sudo[35789]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:29 compute-0 sudo[35941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liavuxjzmwtthkjoipexlkaqleudmpil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095968.9405446-347-171653451712985/AnsiballZ_command.py'
Nov 25 18:39:29 compute-0 sudo[35941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:29 compute-0 python3.9[35943]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 25 18:39:30 compute-0 sudo[35941]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:31 compute-0 sudo[36094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luvmtdkdflyzbnqoifatjgauxgiqnivg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095970.8224456-363-31457214722362/AnsiballZ_file.py'
Nov 25 18:39:31 compute-0 sudo[36094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:31 compute-0 python3.9[36096]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:39:31 compute-0 sudo[36094]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:32 compute-0 sudo[36246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-konggumwshwsbiccewiqhweqgvpnozml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095972.046258-379-85297324687322/AnsiballZ_mount.py'
Nov 25 18:39:32 compute-0 sudo[36246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:32 compute-0 python3.9[36248]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 25 18:39:32 compute-0 sudo[36246]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:34 compute-0 sudo[36398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deraeratslyxskjipaojktjmteziyfik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095973.6721754-435-61292297074186/AnsiballZ_file.py'
Nov 25 18:39:34 compute-0 sudo[36398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:34 compute-0 python3.9[36400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:39:34 compute-0 sudo[36398]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:34 compute-0 sudo[36550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aexldyqwsanhysksvurruangfjvmmkme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095974.5275757-451-124238748372893/AnsiballZ_stat.py'
Nov 25 18:39:34 compute-0 sudo[36550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:35 compute-0 python3.9[36552]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:39:35 compute-0 sudo[36550]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:35 compute-0 sudo[36673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxnvjahcnntehfbnjsafgsbaecgkoaju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095974.5275757-451-124238748372893/AnsiballZ_copy.py'
Nov 25 18:39:35 compute-0 sudo[36673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:35 compute-0 python3.9[36675]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764095974.5275757-451-124238748372893/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:39:35 compute-0 sudo[36673]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:36 compute-0 sudo[36825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlidehthybdvxqgqrefufnhpzqhtnadv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095976.4910133-499-235627844895062/AnsiballZ_stat.py'
Nov 25 18:39:36 compute-0 sudo[36825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:38 compute-0 python3.9[36827]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:39:38 compute-0 sudo[36825]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:40 compute-0 sudo[36978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpeqhecglqatbpfleauctxwclhxursps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095979.6527112-515-165118917685363/AnsiballZ_command.py'
Nov 25 18:39:40 compute-0 sudo[36978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:40 compute-0 python3.9[36980]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:39:40 compute-0 sudo[36978]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:40 compute-0 sudo[37131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzohbhwoikiyykojyqyvwrxdbtdyurmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095980.6080854-531-111152265739598/AnsiballZ_file.py'
Nov 25 18:39:40 compute-0 sudo[37131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:41 compute-0 python3.9[37133]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:39:41 compute-0 sudo[37131]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:47 compute-0 sudo[37283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivwfgxykgnvoymeuwxmnmxlyazxzdbkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095986.6976426-553-193524847970358/AnsiballZ_getent.py'
Nov 25 18:39:47 compute-0 sudo[37283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:47 compute-0 python3.9[37285]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 25 18:39:47 compute-0 sudo[37283]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:47 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:39:48 compute-0 sudo[37437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgkvpyufwqqdpguurnxrivaxmnsdjkxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095987.7545907-569-10458298883071/AnsiballZ_group.py'
Nov 25 18:39:48 compute-0 sudo[37437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:48 compute-0 python3.9[37439]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 18:39:48 compute-0 groupadd[37440]: group added to /etc/group: name=qemu, GID=107
Nov 25 18:39:48 compute-0 groupadd[37440]: group added to /etc/gshadow: name=qemu
Nov 25 18:39:48 compute-0 groupadd[37440]: new group: name=qemu, GID=107
Nov 25 18:39:48 compute-0 sudo[37437]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:49 compute-0 sudo[37595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcysujcaoszlsmgwwugcdfyttslzsmub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095988.813497-585-237466804753968/AnsiballZ_user.py'
Nov 25 18:39:49 compute-0 sudo[37595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:49 compute-0 python3.9[37597]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 18:39:49 compute-0 useradd[37599]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 18:39:49 compute-0 sudo[37595]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:50 compute-0 sudo[37755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwqbcnrfdstalnqsaeijtpebkozjvpsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095990.2109401-601-4718281734791/AnsiballZ_getent.py'
Nov 25 18:39:50 compute-0 sudo[37755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:50 compute-0 python3.9[37757]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 25 18:39:50 compute-0 sudo[37755]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:51 compute-0 sudo[37908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfltmciuapfvvkjuaiklphblgbupxonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095991.049052-617-231994456546347/AnsiballZ_group.py'
Nov 25 18:39:51 compute-0 sudo[37908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:51 compute-0 python3.9[37910]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 18:39:51 compute-0 groupadd[37911]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 25 18:39:51 compute-0 groupadd[37911]: group added to /etc/gshadow: name=hugetlbfs
Nov 25 18:39:51 compute-0 groupadd[37911]: new group: name=hugetlbfs, GID=42477
Nov 25 18:39:51 compute-0 sudo[37908]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:52 compute-0 sudo[38066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cggbmnzdfeajarccjxlifmazcwmdyzwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095992.0025191-635-89162030420676/AnsiballZ_file.py'
Nov 25 18:39:52 compute-0 sudo[38066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:52 compute-0 python3.9[38068]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 25 18:39:52 compute-0 sudo[38066]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:53 compute-0 sudo[38218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhqvyokruocbsnblihmjarqfgxjjhhvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095993.0532632-657-177920008314722/AnsiballZ_dnf.py'
Nov 25 18:39:53 compute-0 sudo[38218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:53 compute-0 python3.9[38220]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:39:55 compute-0 sudo[38218]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:55 compute-0 sudo[38371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewuqndhsqmclkzpgbonysoobtknapakt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095995.518537-673-79921889207696/AnsiballZ_file.py'
Nov 25 18:39:55 compute-0 sudo[38371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:56 compute-0 python3.9[38373]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:39:56 compute-0 sudo[38371]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:56 compute-0 sudo[38523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldzrzltmyzybtujdesbynvpzupxjvxwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095996.3593369-689-50512158797056/AnsiballZ_stat.py'
Nov 25 18:39:56 compute-0 sudo[38523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:56 compute-0 python3.9[38525]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:39:56 compute-0 sudo[38523]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:57 compute-0 sudo[38646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikfmgykjrtbxzrlxwolqxhrceymdclze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095996.3593369-689-50512158797056/AnsiballZ_copy.py'
Nov 25 18:39:57 compute-0 sudo[38646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:57 compute-0 python3.9[38648]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095996.3593369-689-50512158797056/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:39:57 compute-0 sudo[38646]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:58 compute-0 sudo[38798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amxyxiyzleabzymrbgasncpkhtshylmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095997.934664-719-262896209191181/AnsiballZ_systemd.py'
Nov 25 18:39:58 compute-0 sudo[38798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:58 compute-0 python3.9[38800]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:39:59 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 25 18:39:59 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 25 18:39:59 compute-0 kernel: Bridge firewalling registered
Nov 25 18:39:59 compute-0 systemd-modules-load[38804]: Inserted module 'br_netfilter'
Nov 25 18:39:59 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 25 18:39:59 compute-0 sudo[38798]: pam_unix(sudo:session): session closed for user root
Nov 25 18:39:59 compute-0 sudo[38959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjrcwntzlxexjpiaoofpapqgugurnhtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095999.3877673-735-271875601247575/AnsiballZ_stat.py'
Nov 25 18:39:59 compute-0 sudo[38959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:39:59 compute-0 python3.9[38961]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:39:59 compute-0 sudo[38959]: pam_unix(sudo:session): session closed for user root
Nov 25 18:40:00 compute-0 sudo[39082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhsoednlwracivltkdocolttzcokdpln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764095999.3877673-735-271875601247575/AnsiballZ_copy.py'
Nov 25 18:40:00 compute-0 sudo[39082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:40:00 compute-0 python3.9[39084]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764095999.3877673-735-271875601247575/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:40:00 compute-0 sudo[39082]: pam_unix(sudo:session): session closed for user root
Nov 25 18:40:01 compute-0 sudo[39234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-assvdskmvahibnptmatfgkdsogrvwryc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096001.3322873-771-18665236847795/AnsiballZ_dnf.py'
Nov 25 18:40:01 compute-0 sudo[39234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:40:01 compute-0 python3.9[39236]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:40:05 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:40:05 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:40:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:40:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:40:05 compute-0 systemd[1]: Reloading.
Nov 25 18:40:05 compute-0 systemd-rc-local-generator[39301]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:40:05 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:40:06 compute-0 sudo[39234]: pam_unix(sudo:session): session closed for user root
Nov 25 18:40:07 compute-0 python3.9[40463]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:40:08 compute-0 python3.9[41349]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 25 18:40:09 compute-0 python3.9[42112]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:40:09 compute-0 sudo[42988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-magnssdvtqanrwqscmnqkwqvftgisoyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096009.5290387-849-165012184408647/AnsiballZ_command.py'
Nov 25 18:40:09 compute-0 sudo[42988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:40:10 compute-0 python3.9[43013]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:40:10 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 25 18:40:10 compute-0 systemd[1]: Starting Authorization Manager...
Nov 25 18:40:10 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 25 18:40:10 compute-0 polkitd[43621]: Started polkitd version 0.117
Nov 25 18:40:10 compute-0 polkitd[43621]: Loading rules from directory /etc/polkit-1/rules.d
Nov 25 18:40:10 compute-0 polkitd[43621]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 25 18:40:10 compute-0 polkitd[43621]: Finished loading, compiling and executing 2 rules
Nov 25 18:40:10 compute-0 systemd[1]: Started Authorization Manager.
Nov 25 18:40:10 compute-0 polkitd[43621]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 25 18:40:11 compute-0 sudo[42988]: pam_unix(sudo:session): session closed for user root
Nov 25 18:40:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:40:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:40:11 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.405s CPU time.
Nov 25 18:40:11 compute-0 systemd[1]: run-r6bbe9d973f6943c78d93b0fafd5a0c74.service: Deactivated successfully.
Nov 25 18:40:11 compute-0 sudo[43790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrxiwssgmxpreguipmpoxfuoonxhthes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096011.6101305-867-273126198012219/AnsiballZ_systemd.py'
Nov 25 18:40:12 compute-0 sudo[43790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:40:12 compute-0 python3.9[43792]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:40:12 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 25 18:40:12 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 25 18:40:12 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 25 18:40:12 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 25 18:40:12 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 25 18:40:12 compute-0 sudo[43790]: pam_unix(sudo:session): session closed for user root
Nov 25 18:40:13 compute-0 python3.9[43954]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 25 18:40:16 compute-0 sudo[44104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvzengssahkvqgnlxcnqxpizojuhmllq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096016.3191748-981-217957917970323/AnsiballZ_systemd.py'
Nov 25 18:40:16 compute-0 sudo[44104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:40:17 compute-0 python3.9[44106]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:40:18 compute-0 systemd[1]: Reloading.
Nov 25 18:40:18 compute-0 systemd-rc-local-generator[44135]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:40:18 compute-0 sudo[44104]: pam_unix(sudo:session): session closed for user root
Nov 25 18:40:19 compute-0 sudo[44292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tshmyrwnugtreuamhfxtfsaqslehynzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096018.6906946-981-35795410360025/AnsiballZ_systemd.py'
Nov 25 18:40:19 compute-0 sudo[44292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:40:19 compute-0 python3.9[44294]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:40:19 compute-0 systemd[1]: Reloading.
Nov 25 18:40:19 compute-0 systemd-rc-local-generator[44325]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:40:19 compute-0 sudo[44292]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:02 compute-0 sudo[44482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvonbntujickwtzvjikgkzllzozzrhds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096062.2119632-1013-223145904443907/AnsiballZ_command.py'
Nov 25 18:41:02 compute-0 sudo[44482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:02 compute-0 python3.9[44484]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:41:02 compute-0 sudo[44482]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:03 compute-0 sudo[44635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmjeisiwbgrigvssywhqpeyeoiolfiio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096063.6060345-1029-53707719924032/AnsiballZ_command.py'
Nov 25 18:41:03 compute-0 sudo[44635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:04 compute-0 python3.9[44637]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:41:04 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 25 18:41:04 compute-0 sudo[44635]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:04 compute-0 sudo[44788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhloklpkwzppbrwvuoionxmqkqahoij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096064.5405014-1045-23692578329660/AnsiballZ_command.py'
Nov 25 18:41:04 compute-0 sudo[44788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:05 compute-0 python3.9[44790]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:41:06 compute-0 sudo[44788]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:07 compute-0 sudo[44950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihlnxqluyhzwyijxitzuacvcvlbevkgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096067.0769777-1061-268861499406311/AnsiballZ_command.py'
Nov 25 18:41:07 compute-0 sudo[44950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:07 compute-0 python3.9[44952]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:41:07 compute-0 sudo[44950]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:13 compute-0 sudo[45103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfzbznyjyjbeibwhzjiyfpgawgwsxefs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096073.371306-1077-178062241670293/AnsiballZ_systemd.py'
Nov 25 18:41:13 compute-0 sudo[45103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:14 compute-0 python3.9[45105]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:41:14 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 25 18:41:14 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 25 18:41:14 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 25 18:41:14 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 25 18:41:14 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 25 18:41:14 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 25 18:41:14 compute-0 sudo[45103]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:14 compute-0 sshd-session[31538]: Connection closed by 192.168.122.30 port 35140
Nov 25 18:41:14 compute-0 sshd-session[31535]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:41:14 compute-0 systemd-logind[820]: Session 10 logged out. Waiting for processes to exit.
Nov 25 18:41:14 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 25 18:41:14 compute-0 systemd[1]: session-10.scope: Consumed 2min 22.477s CPU time.
Nov 25 18:41:14 compute-0 systemd-logind[820]: Removed session 10.
Nov 25 18:41:19 compute-0 sshd-session[45135]: Accepted publickey for zuul from 192.168.122.30 port 36692 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:41:20 compute-0 systemd-logind[820]: New session 11 of user zuul.
Nov 25 18:41:20 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 25 18:41:20 compute-0 sshd-session[45135]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:41:21 compute-0 python3.9[45288]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:41:22 compute-0 python3.9[45442]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:41:24 compute-0 sudo[45596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prlmavfcheussqmeqmmabsltmgtqfwyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096083.5309072-80-279428814164067/AnsiballZ_command.py'
Nov 25 18:41:24 compute-0 sudo[45596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:24 compute-0 python3.9[45598]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:41:24 compute-0 sudo[45596]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:25 compute-0 python3.9[45749]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:41:26 compute-0 sudo[45903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zubsbjghtlzkfyomlgawkfniyvbnxbxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096086.1260653-120-200443956322482/AnsiballZ_setup.py'
Nov 25 18:41:26 compute-0 sudo[45903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:26 compute-0 python3.9[45905]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:41:27 compute-0 sudo[45903]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:27 compute-0 sudo[45987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hseudijivlfxcsvbhsgbprcvvneefkzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096086.1260653-120-200443956322482/AnsiballZ_dnf.py'
Nov 25 18:41:27 compute-0 sudo[45987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:27 compute-0 python3.9[45989]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:41:29 compute-0 sudo[45987]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:30 compute-0 sudo[46140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdqksczouohksaechhnxcjujrmbrrxxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096089.6577368-144-263566679125551/AnsiballZ_setup.py'
Nov 25 18:41:30 compute-0 sudo[46140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:30 compute-0 python3.9[46142]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:41:30 compute-0 sudo[46140]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:31 compute-0 sudo[46311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlthbdldjjxnjqwpanyvcnegvqomtheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096091.1966908-166-67708456496991/AnsiballZ_file.py'
Nov 25 18:41:31 compute-0 sudo[46311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:31 compute-0 python3.9[46313]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:41:31 compute-0 sudo[46311]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:32 compute-0 sudo[46463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwthyapyxsdppnfkaottdvknvsiqrtze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096092.157296-182-252565825560614/AnsiballZ_command.py'
Nov 25 18:41:32 compute-0 sudo[46463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:32 compute-0 python3.9[46465]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:41:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3453278413-merged.mount: Deactivated successfully.
Nov 25 18:41:32 compute-0 podman[46466]: 2025-11-25 18:41:32.855921242 +0000 UTC m=+0.079907623 system refresh
Nov 25 18:41:32 compute-0 sudo[46463]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:41:35 compute-0 sudo[46626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejznvbcuwkdxtomceklivszcwcjhcwqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096095.2949162-198-94482271991962/AnsiballZ_stat.py'
Nov 25 18:41:35 compute-0 sudo[46626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:36 compute-0 python3.9[46628]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:41:36 compute-0 sudo[46626]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:36 compute-0 sudo[46749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khwqkdxqmuwcnueozdeflhymzmfmotfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096095.2949162-198-94482271991962/AnsiballZ_copy.py'
Nov 25 18:41:36 compute-0 sudo[46749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:36 compute-0 python3.9[46751]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096095.2949162-198-94482271991962/.source.json follow=False _original_basename=podman_network_config.j2 checksum=66a4dd97f4ff1f5da9355c99baf891eca0771ed0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:41:36 compute-0 sudo[46749]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:37 compute-0 sudo[46901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srlhfxkkixnmzbdenampagwvwbzmgwnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096097.1469173-228-278992878115014/AnsiballZ_stat.py'
Nov 25 18:41:37 compute-0 sudo[46901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:37 compute-0 python3.9[46903]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:41:37 compute-0 sudo[46901]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:38 compute-0 sudo[47024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnmzyneydykjqwccdvurfgsirhcfguas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096097.1469173-228-278992878115014/AnsiballZ_copy.py'
Nov 25 18:41:38 compute-0 sudo[47024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:38 compute-0 python3.9[47026]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096097.1469173-228-278992878115014/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51f7dfe021bf6a784cb4010cf142a3df219fb1a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:41:38 compute-0 sudo[47024]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:39 compute-0 sudo[47176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fftmpyydsopxwztcnorgyrbmrbrjiwbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096098.802524-260-254823569987460/AnsiballZ_ini_file.py'
Nov 25 18:41:39 compute-0 sudo[47176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:39 compute-0 python3.9[47178]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:41:39 compute-0 sudo[47176]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:40 compute-0 sudo[47328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brfyqtyhorjicxfpjtwfkhhyyhmhhgrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096099.7853339-260-202396022255859/AnsiballZ_ini_file.py'
Nov 25 18:41:40 compute-0 sudo[47328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:40 compute-0 python3.9[47330]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:41:40 compute-0 sudo[47328]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:40 compute-0 sudo[47480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrruftfnjzbpnidurseozwiobwkzfvrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096100.5946276-260-205948596095820/AnsiballZ_ini_file.py'
Nov 25 18:41:40 compute-0 sudo[47480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:41 compute-0 python3.9[47482]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:41:41 compute-0 sudo[47480]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:41 compute-0 sudo[47632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fppqwbuxknurjlfixrnuxbmfosopahmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096101.4437943-260-87340515637783/AnsiballZ_ini_file.py'
Nov 25 18:41:41 compute-0 sudo[47632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:42 compute-0 python3.9[47634]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:41:42 compute-0 sudo[47632]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:43 compute-0 python3.9[47784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:41:44 compute-0 sudo[47936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqqkceyutgonlxzqvheqnlummirytlii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096103.6692476-340-96781498928307/AnsiballZ_dnf.py'
Nov 25 18:41:44 compute-0 sudo[47936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:44 compute-0 python3.9[47938]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:41:45 compute-0 sudo[47936]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:46 compute-0 sudo[48089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzpnekdeupopposcdvtblwacjieccvix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096106.1696007-356-251509311059809/AnsiballZ_dnf.py'
Nov 25 18:41:46 compute-0 sudo[48089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:46 compute-0 python3.9[48091]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:41:49 compute-0 sudo[48089]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:49 compute-0 sudo[48249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvgixbsibjzkzwdvlsoiapbhhbpwsrsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096109.5953445-376-189085055011768/AnsiballZ_dnf.py'
Nov 25 18:41:49 compute-0 sudo[48249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:50 compute-0 python3.9[48251]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:41:51 compute-0 sudo[48249]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:52 compute-0 sudo[48402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqbldniahhrgyzphqnvfvqyjogrvhytp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096111.7173424-394-212522478910807/AnsiballZ_dnf.py'
Nov 25 18:41:52 compute-0 sudo[48402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:52 compute-0 python3.9[48404]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:41:53 compute-0 sudo[48402]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:54 compute-0 sudo[48555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruytxxytffhfbeljbaoxiqjlkautckec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096114.158148-416-202486299674655/AnsiballZ_dnf.py'
Nov 25 18:41:54 compute-0 sudo[48555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:54 compute-0 python3.9[48557]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:41:56 compute-0 sudo[48555]: pam_unix(sudo:session): session closed for user root
Nov 25 18:41:57 compute-0 sudo[48711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqqvgbktihbavsofprqllicfvbkjruoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096116.7929308-432-88970370304525/AnsiballZ_dnf.py'
Nov 25 18:41:57 compute-0 sudo[48711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:41:57 compute-0 python3.9[48713]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:42:00 compute-0 sudo[48711]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:01 compute-0 sudo[48880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsvlbvlzrnydenhsilbouplldtdrydar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096121.1809032-450-102726577152518/AnsiballZ_dnf.py'
Nov 25 18:42:01 compute-0 sudo[48880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:01 compute-0 python3.9[48882]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:42:03 compute-0 sudo[48880]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:03 compute-0 sudo[49033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amthqhiszxsqqwusffglixencyfqrxem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096123.3744433-468-270558471039614/AnsiballZ_dnf.py'
Nov 25 18:42:03 compute-0 sudo[49033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:04 compute-0 python3.9[49035]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:42:18 compute-0 sudo[49033]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:20 compute-0 sudo[49370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duyiahbvqlcyigvfpsfhcsfdwjajqhbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096140.465807-486-137417206035088/AnsiballZ_dnf.py'
Nov 25 18:42:20 compute-0 sudo[49370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:21 compute-0 python3.9[49372]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:42:22 compute-0 sudo[49370]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:23 compute-0 sudo[49526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvwjnuzmnbwgwordqepqnliqdgymourr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096142.8937743-508-55291294859191/AnsiballZ_file.py'
Nov 25 18:42:23 compute-0 sudo[49526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:23 compute-0 python3.9[49528]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:42:23 compute-0 sudo[49526]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:24 compute-0 sudo[49701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbanosvbwzrgotcshakuovwklfeoosmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096143.680081-524-257237871382467/AnsiballZ_stat.py'
Nov 25 18:42:24 compute-0 sudo[49701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:24 compute-0 python3.9[49703]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:42:24 compute-0 sudo[49701]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:24 compute-0 sudo[49824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vobchhaahhabjzfhtkeidzotoiqruekh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096143.680081-524-257237871382467/AnsiballZ_copy.py'
Nov 25 18:42:24 compute-0 sudo[49824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:25 compute-0 python3.9[49826]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764096143.680081-524-257237871382467/.source.json _original_basename=.4rqj8i23 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:42:25 compute-0 sudo[49824]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:26 compute-0 sudo[49976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcfeomvcqfdnobmakyjkmxfumczjmplo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096145.465256-560-118642253185579/AnsiballZ_podman_image.py'
Nov 25 18:42:26 compute-0 sudo[49976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:26 compute-0 python3.9[49978]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 18:42:26 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2835607213-lower\x2dmapped.mount: Deactivated successfully.
Nov 25 18:42:32 compute-0 podman[49989]: 2025-11-25 18:42:32.772031566 +0000 UTC m=+6.421984242 image pull 6a8194dc5cbc0b30fb087899a1cd17693ec8d18197c75e9f4cc0e4bdb35c6c1c 38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Nov 25 18:42:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:33 compute-0 sudo[49976]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:34 compute-0 sudo[50283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvyigxymkwuerysmbqjeutoswzxgdnue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096154.005536-582-258227995076132/AnsiballZ_podman_image.py'
Nov 25 18:42:34 compute-0 sudo[50283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:34 compute-0 python3.9[50285]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 18:42:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:43 compute-0 podman[50297]: 2025-11-25 18:42:43.43198488 +0000 UTC m=+8.698584324 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 18:42:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:43 compute-0 sudo[50283]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:44 compute-0 sudo[50596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjjnyfrbgzvksbmoqbixfkgippagmzch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096164.4899807-602-158040538131467/AnsiballZ_podman_image.py'
Nov 25 18:42:44 compute-0 sudo[50596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:45 compute-0 python3.9[50598]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 18:42:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:45 compute-0 podman[50611]: 2025-11-25 18:42:45.496878433 +0000 UTC m=+0.374894223 image pull 3b9623fd19bd3aa77b0b5fd336125d3125adff84d7957abb18fcf4bd44d404d6 38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Nov 25 18:42:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:45 compute-0 sudo[50596]: pam_unix(sudo:session): session closed for user root
Nov 25 18:42:46 compute-0 sudo[50847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypbobjflongqyshpeocjancvjsufcvsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096166.2863019-620-103355194262339/AnsiballZ_podman_image.py'
Nov 25 18:42:46 compute-0 sudo[50847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:42:46 compute-0 python3.9[50849]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 18:42:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:57 compute-0 podman[50861]: 2025-11-25 18:42:57.170084407 +0000 UTC m=+10.218363930 image pull bbd9e65c99fb428dc4f8c73808d764a75903488c747752b60e55265221d7aeb4 38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Nov 25 18:42:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:42:57 compute-0 sudo[50847]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:00 compute-0 sudo[51117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfyzwreiutwtedunvgdbpcvbdmtmnlxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096179.777481-642-144223696293792/AnsiballZ_podman_image.py'
Nov 25 18:43:00 compute-0 sudo[51117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:00 compute-0 python3.9[51119]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.27:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 18:43:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:03 compute-0 podman[51131]: 2025-11-25 18:43:03.450830776 +0000 UTC m=+3.007516061 image pull cd70342b3c5d9a446126ca76e4f0a417f21a4928ec055108b55b0547b78128d1 38.102.83.27:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Nov 25 18:43:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:03 compute-0 sudo[51117]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:04 compute-0 sudo[51387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiythzltraarrxfgwlyehxrcjidowbsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096183.955596-642-249382424410705/AnsiballZ_podman_image.py'
Nov 25 18:43:04 compute-0 sudo[51387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:04 compute-0 python3.9[51389]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 25 18:43:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:05 compute-0 podman[51401]: 2025-11-25 18:43:05.753141062 +0000 UTC m=+1.165194056 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 25 18:43:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:43:06 compute-0 sudo[51387]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:07 compute-0 sshd-session[45138]: Connection closed by 192.168.122.30 port 36692
Nov 25 18:43:07 compute-0 sshd-session[45135]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:43:07 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 25 18:43:07 compute-0 systemd[1]: session-11.scope: Consumed 2min 1.430s CPU time.
Nov 25 18:43:07 compute-0 systemd-logind[820]: Session 11 logged out. Waiting for processes to exit.
Nov 25 18:43:07 compute-0 systemd-logind[820]: Removed session 11.
Nov 25 18:43:12 compute-0 sshd-session[51548]: Accepted publickey for zuul from 192.168.122.30 port 47132 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:43:12 compute-0 systemd-logind[820]: New session 12 of user zuul.
Nov 25 18:43:12 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 25 18:43:12 compute-0 sshd-session[51548]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:43:14 compute-0 python3.9[51701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:43:15 compute-0 sudo[51855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heqemcswfladscxxbbojtzaeqifjuefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096194.7737637-52-37658671186639/AnsiballZ_getent.py'
Nov 25 18:43:15 compute-0 sudo[51855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:15 compute-0 python3.9[51857]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 25 18:43:15 compute-0 sudo[51855]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:16 compute-0 sudo[52008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfhncdtmrcorlgpochciemlaxvvlpwwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096195.8509831-68-119713538549129/AnsiballZ_group.py'
Nov 25 18:43:16 compute-0 sudo[52008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:16 compute-0 python3.9[52010]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 18:43:16 compute-0 groupadd[52011]: group added to /etc/group: name=openvswitch, GID=42476
Nov 25 18:43:16 compute-0 groupadd[52011]: group added to /etc/gshadow: name=openvswitch
Nov 25 18:43:16 compute-0 groupadd[52011]: new group: name=openvswitch, GID=42476
Nov 25 18:43:16 compute-0 sudo[52008]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:17 compute-0 sudo[52166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjblnflarnrwkjpbskgvhmmktbxumhxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096196.8998253-84-224836898156944/AnsiballZ_user.py'
Nov 25 18:43:17 compute-0 sudo[52166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:17 compute-0 python3.9[52168]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 18:43:17 compute-0 useradd[52170]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 18:43:17 compute-0 useradd[52170]: add 'openvswitch' to group 'hugetlbfs'
Nov 25 18:43:17 compute-0 useradd[52170]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 25 18:43:17 compute-0 sudo[52166]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:18 compute-0 sudo[52326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cthkddotszpccggkjqlmffxkhmfhkrwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096198.1027079-104-104356040621312/AnsiballZ_setup.py'
Nov 25 18:43:18 compute-0 sudo[52326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:18 compute-0 python3.9[52328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:43:19 compute-0 sudo[52326]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:19 compute-0 sudo[52410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lacuhlnuycmjwuirxgoicvxpauvtuyhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096198.1027079-104-104356040621312/AnsiballZ_dnf.py'
Nov 25 18:43:19 compute-0 sudo[52410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:19 compute-0 python3.9[52412]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:43:21 compute-0 sudo[52410]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:22 compute-0 sudo[52571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsnjtgpgrdauyindirvmthejwpvmgghv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096201.849789-132-187459968866614/AnsiballZ_dnf.py'
Nov 25 18:43:22 compute-0 sudo[52571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:22 compute-0 python3.9[52573]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:43:33 compute-0 kernel: SELinux:  Converting 2733 SID table entries...
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:43:33 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:43:33 compute-0 groupadd[52605]: group added to /etc/group: name=unbound, GID=993
Nov 25 18:43:33 compute-0 groupadd[52605]: group added to /etc/gshadow: name=unbound
Nov 25 18:43:33 compute-0 groupadd[52605]: new group: name=unbound, GID=993
Nov 25 18:43:33 compute-0 useradd[52612]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 25 18:43:33 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 25 18:43:33 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 25 18:43:35 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:43:35 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:43:35 compute-0 systemd[1]: Reloading.
Nov 25 18:43:35 compute-0 systemd-rc-local-generator[53106]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:43:35 compute-0 systemd-sysv-generator[53113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:43:35 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:43:36 compute-0 sudo[52571]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:36 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:43:36 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:43:36 compute-0 systemd[1]: run-r8666f8c2fc884213a6b1b928a69f7566.service: Deactivated successfully.
Nov 25 18:43:38 compute-0 sudo[53678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzvpgbxcusctyszhoicwwijmxzpagjuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096217.7756302-148-175664305778455/AnsiballZ_systemd.py'
Nov 25 18:43:38 compute-0 sudo[53678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:38 compute-0 python3.9[53680]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:43:38 compute-0 systemd[1]: Reloading.
Nov 25 18:43:38 compute-0 systemd-rc-local-generator[53707]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:43:38 compute-0 systemd-sysv-generator[53713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:43:39 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 25 18:43:39 compute-0 chown[53723]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 25 18:43:39 compute-0 ovs-ctl[53728]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 25 18:43:39 compute-0 ovs-ctl[53728]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 25 18:43:39 compute-0 ovs-ctl[53728]: Starting ovsdb-server [  OK  ]
Nov 25 18:43:39 compute-0 ovs-vsctl[53777]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 25 18:43:39 compute-0 ovs-vsctl[53793]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"942ca545-427a-4223-ba58-570f588d0469\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 25 18:43:39 compute-0 ovs-ctl[53728]: Configuring Open vSwitch system IDs [  OK  ]
Nov 25 18:43:39 compute-0 ovs-vsctl[53800]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 25 18:43:39 compute-0 ovs-ctl[53728]: Enabling remote OVSDB managers [  OK  ]
Nov 25 18:43:39 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 25 18:43:39 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 25 18:43:39 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 25 18:43:39 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 25 18:43:39 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 25 18:43:39 compute-0 ovs-ctl[53847]: Inserting openvswitch module [  OK  ]
Nov 25 18:43:39 compute-0 ovs-ctl[53816]: Starting ovs-vswitchd [  OK  ]
Nov 25 18:43:39 compute-0 ovs-vsctl[53864]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 25 18:43:39 compute-0 ovs-ctl[53816]: Enabling remote OVSDB managers [  OK  ]
Nov 25 18:43:39 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 25 18:43:39 compute-0 systemd[1]: Starting Open vSwitch...
Nov 25 18:43:39 compute-0 systemd[1]: Finished Open vSwitch.
Nov 25 18:43:40 compute-0 sudo[53678]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:41 compute-0 python3.9[54016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:43:41 compute-0 sudo[54166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehvzrinyrwctvrtljgvhwbwvekaajltp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096221.277562-184-156514752173298/AnsiballZ_sefcontext.py'
Nov 25 18:43:41 compute-0 sudo[54166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:42 compute-0 python3.9[54168]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 25 18:43:43 compute-0 kernel: SELinux:  Converting 2747 SID table entries...
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:43:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:43:43 compute-0 sudo[54166]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:45 compute-0 python3.9[54323]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:43:46 compute-0 sudo[54479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwvvhoplpsmfvivqpmclmtmoueuyhhfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096226.3686528-220-103121204274949/AnsiballZ_dnf.py'
Nov 25 18:43:46 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 25 18:43:46 compute-0 sudo[54479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:47 compute-0 python3.9[54481]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:43:48 compute-0 sudo[54479]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:49 compute-0 sudo[54632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxygdqoqihlosrkxqlyotihngpieybrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096228.5842516-236-274566902006963/AnsiballZ_command.py'
Nov 25 18:43:49 compute-0 sudo[54632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:49 compute-0 python3.9[54634]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:43:50 compute-0 sudo[54632]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:50 compute-0 sudo[54919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojlnqeydlidikvqqwtlqrpwgashuqhju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096230.3027098-252-228829709782269/AnsiballZ_file.py'
Nov 25 18:43:50 compute-0 sudo[54919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:51 compute-0 python3.9[54921]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 18:43:51 compute-0 sudo[54919]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:52 compute-0 python3.9[55071]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:43:52 compute-0 sudo[55223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebelmmbrzaetbknqpvsnneqaamuzvbal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096232.2697113-284-1132009323780/AnsiballZ_dnf.py'
Nov 25 18:43:52 compute-0 sudo[55223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:52 compute-0 python3.9[55225]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:43:54 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:43:54 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:43:54 compute-0 systemd[1]: Reloading.
Nov 25 18:43:54 compute-0 systemd-rc-local-generator[55263]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:43:54 compute-0 systemd-sysv-generator[55266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:43:55 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:43:55 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:43:55 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:43:55 compute-0 systemd[1]: run-r90015ca8e19a42b6bf27a7859cfe14b7.service: Deactivated successfully.
Nov 25 18:43:55 compute-0 sudo[55223]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:56 compute-0 sudo[55540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pumfiuwdarryrinmbawswxapqlehhymf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096235.7454836-300-88791527527970/AnsiballZ_systemd.py'
Nov 25 18:43:56 compute-0 sudo[55540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:56 compute-0 python3.9[55542]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:43:56 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 25 18:43:56 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 25 18:43:56 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 25 18:43:56 compute-0 systemd[1]: Stopping Network Manager...
Nov 25 18:43:56 compute-0 NetworkManager[7193]: <info>  [1764096236.4116] caught SIGTERM, shutting down normally.
Nov 25 18:43:56 compute-0 NetworkManager[7193]: <info>  [1764096236.4131] dhcp4 (eth0): canceled DHCP transaction
Nov 25 18:43:56 compute-0 NetworkManager[7193]: <info>  [1764096236.4132] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:43:56 compute-0 NetworkManager[7193]: <info>  [1764096236.4132] dhcp4 (eth0): state changed no lease
Nov 25 18:43:56 compute-0 NetworkManager[7193]: <info>  [1764096236.4134] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 18:43:56 compute-0 NetworkManager[7193]: <info>  [1764096236.4203] exiting (success)
Nov 25 18:43:56 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 18:43:56 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 18:43:56 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 25 18:43:56 compute-0 systemd[1]: Stopped Network Manager.
Nov 25 18:43:56 compute-0 systemd[1]: NetworkManager.service: Consumed 12.950s CPU time, 4.1M memory peak, read 0B from disk, written 34.0K to disk.
Nov 25 18:43:56 compute-0 systemd[1]: Starting Network Manager...
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.4871] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:5314dbd0-f0d3-4d8c-818c-96beee19bec6)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.4872] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.4922] manager[0x557857452090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 25 18:43:56 compute-0 systemd[1]: Starting Hostname Service...
Nov 25 18:43:56 compute-0 systemd[1]: Started Hostname Service.
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6018] hostname: hostname: using hostnamed
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6019] hostname: static hostname changed from (none) to "compute-0"
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6022] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6026] manager[0x557857452090]: rfkill: Wi-Fi hardware radio set enabled
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6027] manager[0x557857452090]: rfkill: WWAN hardware radio set enabled
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6043] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6051] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6051] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6052] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6052] manager: Networking is enabled by state file
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6054] settings: Loaded settings plugin: keyfile (internal)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6057] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6075] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6082] dhcp: init: Using DHCP client 'internal'
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6084] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6087] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6092] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6097] device (lo): Activation: starting connection 'lo' (7ebbea74-d1bb-4fb2-acd3-42edf212bfe7)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6102] device (eth0): carrier: link connected
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6105] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6108] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6108] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6112] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6117] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6121] device (eth1): carrier: link connected
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6124] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6128] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (71bc06b1-39da-5e71-b6b2-29261e1233ba) (indicated)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6128] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6131] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6136] device (eth1): Activation: starting connection 'ci-private-network' (71bc06b1-39da-5e71-b6b2-29261e1233ba)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6141] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 25 18:43:56 compute-0 systemd[1]: Started Network Manager.
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6145] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6147] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6148] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6150] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6152] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6153] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6155] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6157] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6161] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6162] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6187] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6208] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6219] dhcp4 (eth0): state changed new lease, address=38.102.83.177
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6227] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6290] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6294] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6300] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6306] device (lo): Activation: successful, device activated.
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6313] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6315] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6318] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6321] device (eth1): Activation: successful, device activated.
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6329] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6332] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6336] manager: NetworkManager state is now CONNECTED_SITE
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6339] device (eth0): Activation: successful, device activated.
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6344] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 25 18:43:56 compute-0 NetworkManager[55552]: <info>  [1764096236.6347] manager: startup complete
Nov 25 18:43:56 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 25 18:43:56 compute-0 sudo[55540]: pam_unix(sudo:session): session closed for user root
Nov 25 18:43:56 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 25 18:43:57 compute-0 sudo[55766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdmebpaopddimndkmgeqkogfqbzqjbla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096237.211213-316-94841861665297/AnsiballZ_dnf.py'
Nov 25 18:43:57 compute-0 sudo[55766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:43:57 compute-0 python3.9[55768]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:44:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:44:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:44:02 compute-0 systemd[1]: Reloading.
Nov 25 18:44:02 compute-0 systemd-sysv-generator[55824]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:44:02 compute-0 systemd-rc-local-generator[55817]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:44:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:44:04 compute-0 sudo[55766]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:04 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:44:04 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:44:04 compute-0 systemd[1]: run-r2c28d45ea42743bab6a0f0d0f6ba4371.service: Deactivated successfully.
Nov 25 18:44:05 compute-0 sudo[56224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycpjdzzzqemcljevrwmfizftvgfeqfgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096245.1251113-340-32825969935392/AnsiballZ_stat.py'
Nov 25 18:44:05 compute-0 sudo[56224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:05 compute-0 python3.9[56226]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:44:05 compute-0 sudo[56224]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:06 compute-0 sudo[56376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dakqrczpbssamibvpcfpzyryljnpvsqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096245.966426-358-161104021429138/AnsiballZ_ini_file.py'
Nov 25 18:44:06 compute-0 sudo[56376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:06 compute-0 python3.9[56378]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:06 compute-0 sudo[56376]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 18:44:07 compute-0 sudo[56530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgognzlwtjzxrsbhxmlmyfajqiyjxbzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096247.0519-378-185705818386663/AnsiballZ_ini_file.py'
Nov 25 18:44:07 compute-0 sudo[56530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:07 compute-0 python3.9[56532]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:07 compute-0 sudo[56530]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:08 compute-0 sudo[56682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evhfddxpwccuxidtdsntzeiqwskednfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096247.8318267-378-62439633935403/AnsiballZ_ini_file.py'
Nov 25 18:44:08 compute-0 sudo[56682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:08 compute-0 python3.9[56684]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:08 compute-0 sudo[56682]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:09 compute-0 sudo[56834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbjxwhfcbajqqpuoaannuephohkplffp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096248.6750178-408-5447148489927/AnsiballZ_ini_file.py'
Nov 25 18:44:09 compute-0 sudo[56834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:09 compute-0 python3.9[56836]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:09 compute-0 sudo[56834]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:09 compute-0 sudo[56986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzswmaylnytirrbpdiumqwvgbhdzdurx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096249.4656289-408-123675424831316/AnsiballZ_ini_file.py'
Nov 25 18:44:09 compute-0 sudo[56986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:10 compute-0 python3.9[56988]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:10 compute-0 sudo[56986]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:10 compute-0 sudo[57138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lisesuckilergvwezsqtmmrwsnfgtitg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096250.3392093-438-202323139205971/AnsiballZ_stat.py'
Nov 25 18:44:10 compute-0 sudo[57138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:10 compute-0 python3.9[57140]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:44:10 compute-0 sudo[57138]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:11 compute-0 sudo[57261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qryowgjyywuftnljdgjvduqshqbqnizg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096250.3392093-438-202323139205971/AnsiballZ_copy.py'
Nov 25 18:44:11 compute-0 sudo[57261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:11 compute-0 python3.9[57263]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096250.3392093-438-202323139205971/.source _original_basename=.f41_gzh6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:11 compute-0 sudo[57261]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:12 compute-0 sudo[57413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsiunalysuxkrzcnalqzoamgkvssions ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096251.9333456-468-188698072957404/AnsiballZ_file.py'
Nov 25 18:44:12 compute-0 sudo[57413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:12 compute-0 python3.9[57415]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:12 compute-0 sudo[57413]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:13 compute-0 sudo[57565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdbgloeqixvnjqbxdnccqnosnvbxpxrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096252.7015982-484-58292063974176/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 25 18:44:13 compute-0 sudo[57565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:13 compute-0 python3.9[57567]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 25 18:44:13 compute-0 sudo[57565]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:14 compute-0 sudo[57717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osreucghayhihcgxzcgqymowfoeeltwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096253.7229764-502-106218801745291/AnsiballZ_file.py'
Nov 25 18:44:14 compute-0 sudo[57717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:14 compute-0 python3.9[57719]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:14 compute-0 sudo[57717]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:15 compute-0 sudo[57869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzszlmrxgsdqyqrjwryeqsbutcdpuzqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096254.7111964-522-1988734942403/AnsiballZ_stat.py'
Nov 25 18:44:15 compute-0 sudo[57869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:15 compute-0 sudo[57869]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:15 compute-0 sudo[57992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inylqfmzrgtkxokjrgdrbvnrocptgekx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096254.7111964-522-1988734942403/AnsiballZ_copy.py'
Nov 25 18:44:15 compute-0 sudo[57992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:15 compute-0 sudo[57992]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:16 compute-0 sudo[58144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aknepgfqvpapnhkcxdrkvpzgjtrkvzda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096256.1766565-552-195188951793434/AnsiballZ_slurp.py'
Nov 25 18:44:16 compute-0 sudo[58144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:16 compute-0 python3.9[58146]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 25 18:44:16 compute-0 sudo[58144]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:18 compute-0 sudo[58319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwmaecrvbguajltvzsrodoxikgqamrpn ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096257.1948092-570-18275172575026/async_wrapper.py j895393867409 300 /home/zuul/.ansible/tmp/ansible-tmp-1764096257.1948092-570-18275172575026/AnsiballZ_edpm_os_net_config.py _'
Nov 25 18:44:18 compute-0 sudo[58319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:18 compute-0 ansible-async_wrapper.py[58321]: Invoked with j895393867409 300 /home/zuul/.ansible/tmp/ansible-tmp-1764096257.1948092-570-18275172575026/AnsiballZ_edpm_os_net_config.py _
Nov 25 18:44:18 compute-0 ansible-async_wrapper.py[58324]: Starting module and watcher
Nov 25 18:44:18 compute-0 ansible-async_wrapper.py[58324]: Start watching 58325 (300)
Nov 25 18:44:18 compute-0 ansible-async_wrapper.py[58325]: Start module (58325)
Nov 25 18:44:18 compute-0 ansible-async_wrapper.py[58321]: Return async_wrapper task started.
Nov 25 18:44:18 compute-0 sudo[58319]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:18 compute-0 python3.9[58326]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 25 18:44:19 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 25 18:44:19 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 25 18:44:19 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 25 18:44:19 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 25 18:44:19 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.4807] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.4828] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5333] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5333] audit: op="connection-add" uuid="ccaebace-4506-4b1b-8614-fe59717db47f" name="br-ex-br" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5346] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5347] audit: op="connection-add" uuid="6a430eeb-0442-4c4a-a319-984287d78ca0" name="br-ex-port" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5359] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5360] audit: op="connection-add" uuid="6ca9da72-fc16-41c0-9f96-332adbf1f24d" name="eth1-port" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5371] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5372] audit: op="connection-add" uuid="b3aff92e-b157-4a59-8f17-d5a509cf97d6" name="vlan20-port" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5382] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5383] audit: op="connection-add" uuid="05b6511b-d81d-4a93-97d0-5c76b052c2ff" name="vlan21-port" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5394] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5395] audit: op="connection-add" uuid="d9e28004-0e06-4d3c-905b-3f9795d04b2d" name="vlan22-port" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5413] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5429] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5430] audit: op="connection-add" uuid="25e29bb1-49ef-4d87-99bd-d4e4a51941f3" name="br-ex-if" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5500] audit: op="connection-update" uuid="71bc06b1-39da-5e71-b6b2-29261e1233ba" name="ci-private-network" args="connection.timestamp,connection.port-type,connection.master,connection.controller,connection.slave-type,ipv4.dns,ipv4.addresses,ipv4.method,ipv4.routes,ipv4.routing-rules,ipv4.never-default,ipv6.dns,ipv6.addresses,ipv6.addr-gen-mode,ipv6.routes,ipv6.routing-rules,ipv6.method,ovs-interface.type,ovs-external-ids.data" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5515] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5517] audit: op="connection-add" uuid="cc648036-2cb5-4683-ab00-23846e427585" name="vlan20-if" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5531] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5532] audit: op="connection-add" uuid="93f01a98-ed91-402a-9e2d-42efe50dfe1c" name="vlan21-if" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5547] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5548] audit: op="connection-add" uuid="732d9060-cca5-4e87-a8d0-3d36731ce6c0" name="vlan22-if" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5562] audit: op="connection-delete" uuid="52d4efbe-1a37-380b-ae77-c8d713488d53" name="Wired connection 1" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5573] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5581] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5585] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ccaebace-4506-4b1b-8614-fe59717db47f)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5585] audit: op="connection-activate" uuid="ccaebace-4506-4b1b-8614-fe59717db47f" name="br-ex-br" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5587] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5593] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5597] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (6a430eeb-0442-4c4a-a319-984287d78ca0)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5599] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5605] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5608] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (6ca9da72-fc16-41c0-9f96-332adbf1f24d)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5609] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5616] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5619] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (b3aff92e-b157-4a59-8f17-d5a509cf97d6)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5621] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5627] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5631] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (05b6511b-d81d-4a93-97d0-5c76b052c2ff)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5633] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5639] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5642] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (d9e28004-0e06-4d3c-905b-3f9795d04b2d)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5643] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5645] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5647] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5652] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5656] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5660] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (25e29bb1-49ef-4d87-99bd-d4e4a51941f3)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5660] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5663] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5665] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5666] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5667] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5676] device (eth1): disconnecting for new activation request.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5677] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5680] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5681] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5683] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5685] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5689] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5693] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (cc648036-2cb5-4683-ab00-23846e427585)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5694] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5697] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5698] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5700] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5702] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5706] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5710] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (93f01a98-ed91-402a-9e2d-42efe50dfe1c)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5711] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5714] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5716] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5717] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5720] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5724] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5728] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (732d9060-cca5-4e87-a8d0-3d36731ce6c0)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5728] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5731] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5733] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5735] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5736] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5746] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5748] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5752] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5754] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5759] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5765] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5768] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5771] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5772] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5790] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5795] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5798] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5800] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 kernel: Timeout policy base is empty
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5807] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 systemd-udevd[58333]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5811] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5816] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5818] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5824] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5829] dhcp4 (eth0): canceled DHCP transaction
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5829] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5829] dhcp4 (eth0): state changed no lease
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5832] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5842] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5846] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58327 uid=0 result="fail" reason="Device is not activated"
Nov 25 18:44:20 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5911] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5921] device (eth1): disconnecting for new activation request.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5921] audit: op="connection-activate" uuid="71bc06b1-39da-5e71-b6b2-29261e1233ba" name="ci-private-network" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5926] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5937] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5962] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Nov 25 18:44:20 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.5993] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 25 18:44:20 compute-0 kernel: br-ex: entered promiscuous mode
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6111] device (eth1): Activation: starting connection 'ci-private-network' (71bc06b1-39da-5e71-b6b2-29261e1233ba)
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6121] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6124] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6133] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6134] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6136] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6137] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6138] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6139] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6149] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6154] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6157] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6161] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6166] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6169] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6173] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6176] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6180] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6183] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6187] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6190] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6194] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6199] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6203] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6239] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6240] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6243] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 systemd-udevd[58331]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:44:20 compute-0 kernel: vlan22: entered promiscuous mode
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6251] device (eth1): Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6275] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6289] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6293] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6300] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 kernel: vlan20: entered promiscuous mode
Nov 25 18:44:20 compute-0 kernel: vlan21: entered promiscuous mode
Nov 25 18:44:20 compute-0 systemd-udevd[58332]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:44:20 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6433] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6436] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6461] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6470] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6483] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6485] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6487] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6495] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6503] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6512] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6531] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6547] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6580] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6582] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 25 18:44:20 compute-0 NetworkManager[55552]: <info>  [1764096260.6591] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 25 18:44:21 compute-0 NetworkManager[55552]: <info>  [1764096261.6580] dhcp4 (eth0): state changed new lease, address=38.102.83.177
Nov 25 18:44:21 compute-0 NetworkManager[55552]: <info>  [1764096261.7661] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Nov 25 18:44:21 compute-0 NetworkManager[55552]: <info>  [1764096261.9452] checkpoint[0x557857429950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 25 18:44:21 compute-0 NetworkManager[55552]: <info>  [1764096261.9455] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58327 uid=0 result="success"
Nov 25 18:44:21 compute-0 sudo[58657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjtarfkadmqoryknohdmjyiblnlilufs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096261.4001503-570-263115288330387/AnsiballZ_async_status.py'
Nov 25 18:44:21 compute-0 sudo[58657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:22 compute-0 python3.9[58660]: ansible-ansible.legacy.async_status Invoked with jid=j895393867409.58321 mode=status _async_dir=/root/.ansible_async
Nov 25 18:44:22 compute-0 sudo[58657]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.1823] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.1833] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.4258] audit: op="networking-control" arg="global-dns-configuration" pid=58327 uid=0 result="success"
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.4298] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.4328] audit: op="networking-control" arg="global-dns-configuration" pid=58327 uid=0 result="success"
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.4346] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.6133] checkpoint[0x557857429a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 25 18:44:22 compute-0 NetworkManager[55552]: <info>  [1764096262.6136] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58327 uid=0 result="success"
Nov 25 18:44:22 compute-0 ansible-async_wrapper.py[58325]: Module complete (58325)
Nov 25 18:44:23 compute-0 ansible-async_wrapper.py[58324]: Done in kid B.
Nov 25 18:44:25 compute-0 sudo[58763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dibnmpruandzfgpyxlrmepnaxxjluhvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096261.4001503-570-263115288330387/AnsiballZ_async_status.py'
Nov 25 18:44:25 compute-0 sudo[58763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:25 compute-0 python3.9[58766]: ansible-ansible.legacy.async_status Invoked with jid=j895393867409.58321 mode=status _async_dir=/root/.ansible_async
Nov 25 18:44:25 compute-0 sudo[58763]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:25 compute-0 sudo[58863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykssctcchlcftzbrlgzxttqjqvfqzffr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096261.4001503-570-263115288330387/AnsiballZ_async_status.py'
Nov 25 18:44:25 compute-0 sudo[58863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:26 compute-0 python3.9[58865]: ansible-ansible.legacy.async_status Invoked with jid=j895393867409.58321 mode=cleanup _async_dir=/root/.ansible_async
Nov 25 18:44:26 compute-0 sudo[58863]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 25 18:44:26 compute-0 sudo[59017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnjolxcjzbliujhixwuhzluzaprggfkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096266.6068976-624-25107437009472/AnsiballZ_stat.py'
Nov 25 18:44:26 compute-0 sudo[59017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:27 compute-0 python3.9[59019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:44:27 compute-0 sudo[59017]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:27 compute-0 sudo[59140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esngblntwguimradyxpdadwrmiadrmwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096266.6068976-624-25107437009472/AnsiballZ_copy.py'
Nov 25 18:44:27 compute-0 sudo[59140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:27 compute-0 python3.9[59142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096266.6068976-624-25107437009472/.source.returncode _original_basename=.p859u31m follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:27 compute-0 sudo[59140]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:28 compute-0 sudo[59292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xocsymwlpczljffpggzghbwfpbnedece ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096268.261951-656-113199108270316/AnsiballZ_stat.py'
Nov 25 18:44:28 compute-0 sudo[59292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:28 compute-0 python3.9[59294]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:44:28 compute-0 sudo[59292]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:29 compute-0 sudo[59416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lymtnpomlswvniljiikwzjqkuuyagckj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096268.261951-656-113199108270316/AnsiballZ_copy.py'
Nov 25 18:44:29 compute-0 sudo[59416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:29 compute-0 python3.9[59418]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096268.261951-656-113199108270316/.source.cfg _original_basename=.ka21lwwn follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:29 compute-0 sudo[59416]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:30 compute-0 sudo[59568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdordccpjzljafebkfuhmfcgnjynybvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096269.7818973-686-81226453239987/AnsiballZ_systemd.py'
Nov 25 18:44:30 compute-0 sudo[59568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:30 compute-0 python3.9[59570]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:44:30 compute-0 systemd[1]: Reloading Network Manager...
Nov 25 18:44:30 compute-0 NetworkManager[55552]: <info>  [1764096270.5844] audit: op="reload" arg="0" pid=59574 uid=0 result="success"
Nov 25 18:44:30 compute-0 NetworkManager[55552]: <info>  [1764096270.5854] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 25 18:44:30 compute-0 systemd[1]: Reloaded Network Manager.
Nov 25 18:44:30 compute-0 sudo[59568]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:31 compute-0 sshd-session[51551]: Connection closed by 192.168.122.30 port 47132
Nov 25 18:44:31 compute-0 sshd-session[51548]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:44:31 compute-0 systemd-logind[820]: Session 12 logged out. Waiting for processes to exit.
Nov 25 18:44:31 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 25 18:44:31 compute-0 systemd[1]: session-12.scope: Consumed 54.781s CPU time.
Nov 25 18:44:31 compute-0 systemd-logind[820]: Removed session 12.
Nov 25 18:44:36 compute-0 sshd-session[59606]: Accepted publickey for zuul from 192.168.122.30 port 52150 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:44:36 compute-0 systemd-logind[820]: New session 13 of user zuul.
Nov 25 18:44:36 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 25 18:44:36 compute-0 sshd-session[59606]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:44:38 compute-0 python3.9[59759]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:44:39 compute-0 python3.9[59914]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:44:40 compute-0 python3.9[60103]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:44:40 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 25 18:44:40 compute-0 sshd-session[59609]: Connection closed by 192.168.122.30 port 52150
Nov 25 18:44:40 compute-0 sshd-session[59606]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:44:40 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 25 18:44:40 compute-0 systemd[1]: session-13.scope: Consumed 2.831s CPU time.
Nov 25 18:44:40 compute-0 systemd-logind[820]: Session 13 logged out. Waiting for processes to exit.
Nov 25 18:44:40 compute-0 systemd-logind[820]: Removed session 13.
Nov 25 18:44:46 compute-0 sshd-session[60132]: Accepted publickey for zuul from 192.168.122.30 port 51838 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:44:46 compute-0 systemd-logind[820]: New session 14 of user zuul.
Nov 25 18:44:46 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 25 18:44:46 compute-0 sshd-session[60132]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:44:47 compute-0 python3.9[60285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:44:48 compute-0 python3.9[60440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:44:49 compute-0 sudo[60594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvblhqlndccsebwvspylrtsmvfkjtmlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096289.080984-60-45935326205867/AnsiballZ_setup.py'
Nov 25 18:44:49 compute-0 sudo[60594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:49 compute-0 python3.9[60596]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:44:50 compute-0 sudo[60594]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:50 compute-0 sudo[60678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noblxybmwbhunflsajvplnsyfpocputc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096289.080984-60-45935326205867/AnsiballZ_dnf.py'
Nov 25 18:44:50 compute-0 sudo[60678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:50 compute-0 python3.9[60680]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:44:52 compute-0 sudo[60678]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:52 compute-0 sudo[60832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmaaaoidwkfbbkdkavtsunnkxsbiruph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096292.1916246-84-256457394995332/AnsiballZ_setup.py'
Nov 25 18:44:52 compute-0 sudo[60832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:52 compute-0 python3.9[60834]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:44:53 compute-0 sudo[60832]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:54 compute-0 sudo[61023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccuckroratiaqgstxdlojpwxrpjaobli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096293.5448034-106-263840265557880/AnsiballZ_file.py'
Nov 25 18:44:54 compute-0 sudo[61023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:54 compute-0 python3.9[61025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:54 compute-0 sudo[61023]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:55 compute-0 sudo[61175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmpdhzemuzxzzfbbxyeqffwlsczrchkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096294.6796305-122-201998993273625/AnsiballZ_command.py'
Nov 25 18:44:55 compute-0 sudo[61175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:55 compute-0 python3.9[61177]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:44:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:44:55 compute-0 sudo[61175]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:56 compute-0 sudo[61339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-badatvghlcfrbwnxafhbftxqrlzermaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096295.7273397-138-2213580013381/AnsiballZ_stat.py'
Nov 25 18:44:56 compute-0 sudo[61339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:56 compute-0 python3.9[61341]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:44:56 compute-0 sudo[61339]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:56 compute-0 sudo[61417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abyookpcnwtnxevfuklefpjzzhonssym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096295.7273397-138-2213580013381/AnsiballZ_file.py'
Nov 25 18:44:56 compute-0 sudo[61417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:56 compute-0 python3.9[61419]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:44:56 compute-0 sudo[61417]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:57 compute-0 sudo[61569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odtrvlyculfonsybmxzvfxlduptmzeyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096297.2062187-162-195658138311519/AnsiballZ_stat.py'
Nov 25 18:44:57 compute-0 sudo[61569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:57 compute-0 python3.9[61571]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:44:57 compute-0 sudo[61569]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:58 compute-0 sudo[61647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzioueikfmucgrlfvqobibuisechogog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096297.2062187-162-195658138311519/AnsiballZ_file.py'
Nov 25 18:44:58 compute-0 sudo[61647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:58 compute-0 python3.9[61649]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:44:58 compute-0 sudo[61647]: pam_unix(sudo:session): session closed for user root
Nov 25 18:44:59 compute-0 sudo[61799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hedlswruhgfhacnaxmcqhhmwiqgysmfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096298.8406234-188-26006573016913/AnsiballZ_ini_file.py'
Nov 25 18:44:59 compute-0 sudo[61799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:44:59 compute-0 python3.9[61801]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:44:59 compute-0 sudo[61799]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:00 compute-0 sudo[61951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cssvygnzuxztxckvobjtyynwvbiflkxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096299.7469344-188-18063043373407/AnsiballZ_ini_file.py'
Nov 25 18:45:00 compute-0 sudo[61951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:00 compute-0 python3.9[61953]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:45:00 compute-0 sudo[61951]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:00 compute-0 sudo[62103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irmfxzfuacvyfhujkjdtxdjfldotwqux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096300.4922938-188-169279535742917/AnsiballZ_ini_file.py'
Nov 25 18:45:00 compute-0 sudo[62103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:01 compute-0 python3.9[62105]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:45:01 compute-0 sudo[62103]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:01 compute-0 sudo[62255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbnlfkbdlducxezifswxbilucieoxqjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096301.2692893-188-215885036340968/AnsiballZ_ini_file.py'
Nov 25 18:45:01 compute-0 sudo[62255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:01 compute-0 python3.9[62257]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:45:01 compute-0 sudo[62255]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:02 compute-0 sudo[62407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-becyembxojxapgcyuzgidmwhdyoklixr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096302.2091076-250-255943311805482/AnsiballZ_dnf.py'
Nov 25 18:45:02 compute-0 sudo[62407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:02 compute-0 python3.9[62409]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:45:03 compute-0 sudo[62407]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:04 compute-0 sudo[62560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euyrykhirgdkoatrjuwotkqklpibzdcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096304.404316-272-31559054340894/AnsiballZ_setup.py'
Nov 25 18:45:04 compute-0 sudo[62560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:05 compute-0 python3.9[62562]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:45:05 compute-0 sudo[62560]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:05 compute-0 sudo[62714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otnodbxmfcgswccrdvrmyrqkvvmhoieh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096305.3360014-288-72740271485083/AnsiballZ_stat.py'
Nov 25 18:45:05 compute-0 sudo[62714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:05 compute-0 python3.9[62716]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:45:05 compute-0 sudo[62714]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:06 compute-0 sudo[62866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vklsneegfgfxsksjgkmlnymukpzsqpua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096306.209305-306-27619876190534/AnsiballZ_stat.py'
Nov 25 18:45:06 compute-0 sudo[62866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:06 compute-0 python3.9[62868]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:45:06 compute-0 sudo[62866]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:07 compute-0 sudo[63018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pobsblgsswsfximnutqljcwndkxvwrzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096307.200764-326-119687002758357/AnsiballZ_command.py'
Nov 25 18:45:07 compute-0 sudo[63018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:07 compute-0 python3.9[63020]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:45:07 compute-0 sudo[63018]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:08 compute-0 sudo[63171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzzfnctcevumxpugniqhbhrruanzffag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096308.167334-346-108778497722559/AnsiballZ_service_facts.py'
Nov 25 18:45:08 compute-0 sudo[63171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:08 compute-0 python3.9[63173]: ansible-service_facts Invoked
Nov 25 18:45:08 compute-0 network[63190]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:45:08 compute-0 network[63191]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:45:08 compute-0 network[63192]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:45:12 compute-0 sudo[63171]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:13 compute-0 sudo[63475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaimccukxaadfmpskcwhynfoklowdkwr ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764096313.3544717-376-107028371931141/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764096313.3544717-376-107028371931141/args'
Nov 25 18:45:13 compute-0 sudo[63475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:13 compute-0 sudo[63475]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:14 compute-0 sudo[63642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzyrrupuuxiiikfcbrbluceswrwiqpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096314.2537491-398-247485667163606/AnsiballZ_dnf.py'
Nov 25 18:45:14 compute-0 sudo[63642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:14 compute-0 python3.9[63644]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:45:15 compute-0 sudo[63642]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:17 compute-0 sudo[63795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vadyrejvtgpgaberzaieipejrkluuzcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096316.3918712-424-22959198932034/AnsiballZ_package_facts.py'
Nov 25 18:45:17 compute-0 sudo[63795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:17 compute-0 python3.9[63797]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 25 18:45:17 compute-0 sudo[63795]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:18 compute-0 sudo[63947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqetyvyxusunnfvdfwhyadvnmprqpgbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096318.3205245-444-191912467200220/AnsiballZ_stat.py'
Nov 25 18:45:18 compute-0 sudo[63947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:18 compute-0 python3.9[63949]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:18 compute-0 sudo[63947]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:19 compute-0 sudo[64072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkxrzmtythqbtlbstcpwohenhsudhuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096318.3205245-444-191912467200220/AnsiballZ_copy.py'
Nov 25 18:45:19 compute-0 sudo[64072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:19 compute-0 python3.9[64074]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096318.3205245-444-191912467200220/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:19 compute-0 sudo[64072]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:20 compute-0 sudo[64227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwidzsrdbzyvoqjicmkbflpqmvijkicc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096320.0681028-474-147393017690289/AnsiballZ_stat.py'
Nov 25 18:45:20 compute-0 sudo[64227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:20 compute-0 python3.9[64229]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:20 compute-0 sudo[64227]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:21 compute-0 sudo[64352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozsjspjqumveapvhxawuzakrlfgsyrnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096320.0681028-474-147393017690289/AnsiballZ_copy.py'
Nov 25 18:45:21 compute-0 sudo[64352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:21 compute-0 python3.9[64354]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096320.0681028-474-147393017690289/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:21 compute-0 sudo[64352]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:22 compute-0 sudo[64506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbgavumqxubdgwxeblsuqboreruzmemx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096322.1330523-516-182769734548583/AnsiballZ_lineinfile.py'
Nov 25 18:45:22 compute-0 sudo[64506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:22 compute-0 python3.9[64508]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:22 compute-0 sudo[64506]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:24 compute-0 sudo[64661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpnkfucbxuspfoxcemlmezljbwpgqkdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096323.766688-546-7813168004993/AnsiballZ_setup.py'
Nov 25 18:45:24 compute-0 sudo[64661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:24 compute-0 python3.9[64663]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:45:24 compute-0 sudo[64661]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:25 compute-0 sudo[64745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhfehtuzghriwkajyrvyagyatcqxdeos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096323.766688-546-7813168004993/AnsiballZ_systemd.py'
Nov 25 18:45:25 compute-0 sudo[64745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:25 compute-0 python3.9[64747]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:45:25 compute-0 sudo[64745]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:26 compute-0 sudo[64899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgivjhovxtslreoflnrmwdvhhxktklux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096326.3969073-578-261055913750172/AnsiballZ_setup.py'
Nov 25 18:45:26 compute-0 sudo[64899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:27 compute-0 python3.9[64901]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:45:27 compute-0 sudo[64899]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:27 compute-0 sudo[64983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zybarnjnbgtxchdfpmvkdujitluigocd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096326.3969073-578-261055913750172/AnsiballZ_systemd.py'
Nov 25 18:45:27 compute-0 sudo[64983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:28 compute-0 python3.9[64985]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:45:28 compute-0 chronyd[833]: chronyd exiting
Nov 25 18:45:28 compute-0 systemd[1]: Stopping NTP client/server...
Nov 25 18:45:28 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 25 18:45:28 compute-0 systemd[1]: Stopped NTP client/server.
Nov 25 18:45:28 compute-0 systemd[1]: Starting NTP client/server...
Nov 25 18:45:28 compute-0 chronyd[64994]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 25 18:45:28 compute-0 chronyd[64994]: Frequency -26.179 +/- 0.199 ppm read from /var/lib/chrony/drift
Nov 25 18:45:28 compute-0 chronyd[64994]: Loaded seccomp filter (level 2)
Nov 25 18:45:28 compute-0 systemd[1]: Started NTP client/server.
Nov 25 18:45:28 compute-0 sudo[64983]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:28 compute-0 sshd-session[60135]: Connection closed by 192.168.122.30 port 51838
Nov 25 18:45:28 compute-0 sshd-session[60132]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:45:28 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 25 18:45:28 compute-0 systemd-logind[820]: Session 14 logged out. Waiting for processes to exit.
Nov 25 18:45:28 compute-0 systemd[1]: session-14.scope: Consumed 29.996s CPU time.
Nov 25 18:45:28 compute-0 systemd-logind[820]: Removed session 14.
Nov 25 18:45:34 compute-0 sshd-session[65020]: Accepted publickey for zuul from 192.168.122.30 port 33322 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:45:34 compute-0 systemd-logind[820]: New session 15 of user zuul.
Nov 25 18:45:34 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 25 18:45:34 compute-0 sshd-session[65020]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:45:35 compute-0 python3.9[65173]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:45:36 compute-0 sudo[65327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlleuupvuixglpgqnxgvkixiffexjdgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096336.354937-46-187298586945208/AnsiballZ_file.py'
Nov 25 18:45:36 compute-0 sudo[65327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:37 compute-0 python3.9[65329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:37 compute-0 sudo[65327]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:37 compute-0 sudo[65502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdnmbyqvhemelpfbqxfvklvhircptbbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096337.3958855-62-146638502346192/AnsiballZ_stat.py'
Nov 25 18:45:37 compute-0 sudo[65502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:38 compute-0 python3.9[65504]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:38 compute-0 sudo[65502]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:38 compute-0 sudo[65580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvfquvzwigwpnvswmaowetlfcerirqzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096337.3958855-62-146638502346192/AnsiballZ_file.py'
Nov 25 18:45:38 compute-0 sudo[65580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:38 compute-0 python3.9[65582]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.1zyk9ysb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:38 compute-0 sudo[65580]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:39 compute-0 sudo[65732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuvrnelsfilozwcossqfdmomdzglqvym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096339.240707-102-57076726915609/AnsiballZ_stat.py'
Nov 25 18:45:39 compute-0 sudo[65732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:39 compute-0 python3.9[65734]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:39 compute-0 sudo[65732]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:40 compute-0 sudo[65855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvydawgudxkqbmwvmiyegxebomiboeeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096339.240707-102-57076726915609/AnsiballZ_copy.py'
Nov 25 18:45:40 compute-0 sudo[65855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:40 compute-0 python3.9[65857]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096339.240707-102-57076726915609/.source _original_basename=.38zin530 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:40 compute-0 sudo[65855]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:41 compute-0 sudo[66007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thaemqnlyebtdkgzzggdxfxchvtxxjbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096340.9435933-134-82775148706416/AnsiballZ_file.py'
Nov 25 18:45:41 compute-0 sudo[66007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:41 compute-0 python3.9[66009]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:45:41 compute-0 sudo[66007]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:42 compute-0 sudo[66159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edvrvudhwxgnwsghsmrrowncuyhpsltq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096341.7467623-150-235841211497554/AnsiballZ_stat.py'
Nov 25 18:45:42 compute-0 sudo[66159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:42 compute-0 python3.9[66161]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:42 compute-0 sudo[66159]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:42 compute-0 sudo[66282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvihpdrqlriqoyvsctucysckubhqbcgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096341.7467623-150-235841211497554/AnsiballZ_copy.py'
Nov 25 18:45:42 compute-0 sudo[66282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:42 compute-0 python3.9[66284]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096341.7467623-150-235841211497554/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:45:43 compute-0 sudo[66282]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:43 compute-0 sudo[66434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyobqwptxtkhsezkstmsmtxcilhykhzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096343.1798868-150-137068967561819/AnsiballZ_stat.py'
Nov 25 18:45:43 compute-0 sudo[66434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:43 compute-0 python3.9[66436]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:43 compute-0 sudo[66434]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:44 compute-0 sudo[66557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpbwfhtirlrodsjwjyffylbluijfdcpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096343.1798868-150-137068967561819/AnsiballZ_copy.py'
Nov 25 18:45:44 compute-0 sudo[66557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:44 compute-0 python3.9[66559]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096343.1798868-150-137068967561819/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:45:44 compute-0 sudo[66557]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:45 compute-0 sudo[66709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlgnitodckulilkvbswdqlbkauzejxqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096344.6960716-208-159383061321461/AnsiballZ_file.py'
Nov 25 18:45:45 compute-0 sudo[66709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:45 compute-0 python3.9[66711]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:45 compute-0 sudo[66709]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:45 compute-0 sudo[66861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzizyopmrzlctujmjkwyizukpaxzjwck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096345.5457954-224-131266035028011/AnsiballZ_stat.py'
Nov 25 18:45:45 compute-0 sudo[66861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:46 compute-0 python3.9[66863]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:46 compute-0 sudo[66861]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:46 compute-0 sudo[66984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvuzuxhzvvgnpfaxujhskwggksjiulsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096345.5457954-224-131266035028011/AnsiballZ_copy.py'
Nov 25 18:45:46 compute-0 sudo[66984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:46 compute-0 python3.9[66986]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096345.5457954-224-131266035028011/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:46 compute-0 sudo[66984]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:47 compute-0 sudo[67136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcxjycdnpytinfrxzfxymtmrcexlcptp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096347.0265055-254-84719757477294/AnsiballZ_stat.py'
Nov 25 18:45:47 compute-0 sudo[67136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:47 compute-0 python3.9[67138]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:47 compute-0 sudo[67136]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:48 compute-0 sudo[67259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trjdurxtytimlhsejacgnnyiptkqdbwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096347.0265055-254-84719757477294/AnsiballZ_copy.py'
Nov 25 18:45:48 compute-0 sudo[67259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:48 compute-0 python3.9[67261]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096347.0265055-254-84719757477294/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:48 compute-0 sudo[67259]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:49 compute-0 sudo[67411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjaflrlpjxyobjhlubbxhgyulqnmzdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096348.5428996-284-69406827739882/AnsiballZ_systemd.py'
Nov 25 18:45:49 compute-0 sudo[67411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:49 compute-0 python3.9[67413]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:45:49 compute-0 systemd[1]: Reloading.
Nov 25 18:45:49 compute-0 systemd-rc-local-generator[67442]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:45:49 compute-0 systemd-sysv-generator[67446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:45:49 compute-0 systemd[1]: Reloading.
Nov 25 18:45:49 compute-0 systemd-sysv-generator[67482]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:45:49 compute-0 systemd-rc-local-generator[67479]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:45:49 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 25 18:45:49 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 25 18:45:50 compute-0 sudo[67411]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:50 compute-0 sudo[67639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szlfngrqwugryzvhemvwcsxgmycsaxaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096350.297885-300-70716805421361/AnsiballZ_stat.py'
Nov 25 18:45:50 compute-0 sudo[67639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:50 compute-0 python3.9[67641]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:50 compute-0 sudo[67639]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:51 compute-0 sudo[67762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hybxwlwsxguzpbngwuwercyakajudvow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096350.297885-300-70716805421361/AnsiballZ_copy.py'
Nov 25 18:45:51 compute-0 sudo[67762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:51 compute-0 python3.9[67764]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096350.297885-300-70716805421361/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:51 compute-0 sudo[67762]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:52 compute-0 sudo[67914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddcqfvjqnotszpvwrqvsmxlhxfgucmoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096351.7332714-330-168836931306372/AnsiballZ_stat.py'
Nov 25 18:45:52 compute-0 sudo[67914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:52 compute-0 python3.9[67916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:45:52 compute-0 sudo[67914]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:52 compute-0 sudo[68037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-natladcualyceceuoiujhncvbntyozfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096351.7332714-330-168836931306372/AnsiballZ_copy.py'
Nov 25 18:45:52 compute-0 sudo[68037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:52 compute-0 python3.9[68039]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096351.7332714-330-168836931306372/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:45:52 compute-0 sudo[68037]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:53 compute-0 sudo[68189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwydjhupxllwroppgasefmbopvqefgzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096353.3329997-360-94142741828554/AnsiballZ_systemd.py'
Nov 25 18:45:53 compute-0 sudo[68189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:45:54 compute-0 python3.9[68191]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:45:54 compute-0 systemd[1]: Reloading.
Nov 25 18:45:54 compute-0 systemd-rc-local-generator[68220]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:45:54 compute-0 systemd-sysv-generator[68223]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:45:54 compute-0 systemd[1]: Reloading.
Nov 25 18:45:54 compute-0 systemd-sysv-generator[68255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:45:54 compute-0 systemd-rc-local-generator[68246]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:45:54 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 18:45:54 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 18:45:54 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 18:45:54 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 18:45:54 compute-0 sudo[68189]: pam_unix(sudo:session): session closed for user root
Nov 25 18:45:55 compute-0 python3.9[68417]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:45:55 compute-0 network[68434]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:45:55 compute-0 network[68435]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:45:55 compute-0 network[68436]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:46:00 compute-0 sudo[68696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khyireifokrbphtcbhvzecrnogngjpqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096359.690798-392-75385777061551/AnsiballZ_systemd.py'
Nov 25 18:46:00 compute-0 sudo[68696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:00 compute-0 python3.9[68698]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:46:00 compute-0 systemd[1]: Reloading.
Nov 25 18:46:00 compute-0 systemd-sysv-generator[68731]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:46:00 compute-0 systemd-rc-local-generator[68727]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:46:00 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 25 18:46:00 compute-0 iptables.init[68738]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 25 18:46:01 compute-0 iptables.init[68738]: iptables: Flushing firewall rules: [  OK  ]
Nov 25 18:46:01 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 25 18:46:01 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 25 18:46:01 compute-0 sudo[68696]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:01 compute-0 sudo[68935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xehgwrvmxvwrwaonkakenxgyiwxicgqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096361.3387156-392-156106092500531/AnsiballZ_systemd.py'
Nov 25 18:46:01 compute-0 sudo[68935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:02 compute-0 python3.9[68937]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:46:02 compute-0 sudo[68935]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:02 compute-0 sudo[69089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yailldskcqllstavgbbqgqmretfrotun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096362.5243459-424-131117200644756/AnsiballZ_systemd.py'
Nov 25 18:46:02 compute-0 sudo[69089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:03 compute-0 python3.9[69091]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:46:03 compute-0 systemd[1]: Reloading.
Nov 25 18:46:03 compute-0 systemd-rc-local-generator[69120]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:46:03 compute-0 systemd-sysv-generator[69124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:46:03 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 25 18:46:03 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 25 18:46:03 compute-0 sudo[69089]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:04 compute-0 sudo[69282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etoxkqpvoxlzvgtaemeuodcxbmmfsgjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096363.9468205-440-262040053509253/AnsiballZ_command.py'
Nov 25 18:46:04 compute-0 sudo[69282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:04 compute-0 python3.9[69284]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:04 compute-0 sudo[69282]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:05 compute-0 sudo[69435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eikjoypfsxjeefubxjrwwjcajmptqwoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096365.21882-468-211586664446824/AnsiballZ_stat.py'
Nov 25 18:46:05 compute-0 sudo[69435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:05 compute-0 python3.9[69437]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:05 compute-0 sudo[69435]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:06 compute-0 sudo[69560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyhkwejpkskvuqqkugdqezgfnaowqfuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096365.21882-468-211586664446824/AnsiballZ_copy.py'
Nov 25 18:46:06 compute-0 sudo[69560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:06 compute-0 python3.9[69562]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096365.21882-468-211586664446824/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:06 compute-0 sudo[69560]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:07 compute-0 sudo[69713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slxlcrpklexzkyrtqatstbtbkvapbqri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096366.7751868-498-129449340709270/AnsiballZ_systemd.py'
Nov 25 18:46:07 compute-0 sudo[69713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:07 compute-0 python3.9[69715]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:46:07 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 25 18:46:07 compute-0 sshd[1009]: Received SIGHUP; restarting.
Nov 25 18:46:07 compute-0 sshd[1009]: Server listening on 0.0.0.0 port 22.
Nov 25 18:46:07 compute-0 sshd[1009]: Server listening on :: port 22.
Nov 25 18:46:07 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 25 18:46:07 compute-0 sudo[69713]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:08 compute-0 sudo[69869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvvqtmpnasdzxqcqnuaxszzaswugrcvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096367.8577135-514-193002732940311/AnsiballZ_file.py'
Nov 25 18:46:08 compute-0 sudo[69869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:08 compute-0 python3.9[69871]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:08 compute-0 sudo[69869]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:09 compute-0 sudo[70021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxcvnpgvcblgklfvssimemnchhjotbmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096368.7298703-530-49567767959354/AnsiballZ_stat.py'
Nov 25 18:46:09 compute-0 sudo[70021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:09 compute-0 python3.9[70023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:09 compute-0 sudo[70021]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:09 compute-0 sudo[70144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlfhwctsbtounojchczijgmuuplcsrgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096368.7298703-530-49567767959354/AnsiballZ_copy.py'
Nov 25 18:46:09 compute-0 sudo[70144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:10 compute-0 python3.9[70146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096368.7298703-530-49567767959354/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:10 compute-0 sudo[70144]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:11 compute-0 sudo[70296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oglbedpvaocucpvnhtfqsdyjyslrkolr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096370.5335538-566-139660921361336/AnsiballZ_timezone.py'
Nov 25 18:46:11 compute-0 sudo[70296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:11 compute-0 python3.9[70298]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 25 18:46:11 compute-0 systemd[1]: Starting Time & Date Service...
Nov 25 18:46:11 compute-0 systemd[1]: Started Time & Date Service.
Nov 25 18:46:11 compute-0 sudo[70296]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:12 compute-0 sudo[70452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kugvroekloikswdeflpflwcuqkqbeivw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096371.7762303-584-192035926660581/AnsiballZ_file.py'
Nov 25 18:46:12 compute-0 sudo[70452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:12 compute-0 python3.9[70454]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:12 compute-0 sudo[70452]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:12 compute-0 sudo[70604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yboyrgyeasdcqkqqzfuvmckhxkvpwsxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096372.5726016-600-57368001083579/AnsiballZ_stat.py'
Nov 25 18:46:12 compute-0 sudo[70604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:13 compute-0 python3.9[70606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:13 compute-0 sudo[70604]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:13 compute-0 sudo[70727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjncxjtsqnpljsbsumuzomokwipnaycy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096372.5726016-600-57368001083579/AnsiballZ_copy.py'
Nov 25 18:46:13 compute-0 sudo[70727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:13 compute-0 python3.9[70729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096372.5726016-600-57368001083579/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:13 compute-0 sudo[70727]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:14 compute-0 sudo[70879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acellimghrzgpeqfibrqbwmfepltdwfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096374.0170474-630-46344443516633/AnsiballZ_stat.py'
Nov 25 18:46:14 compute-0 sudo[70879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:14 compute-0 python3.9[70881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:14 compute-0 sudo[70879]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:15 compute-0 sudo[71002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smzwewlgxzrtmpbnrdjguyeduhvqoqph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096374.0170474-630-46344443516633/AnsiballZ_copy.py'
Nov 25 18:46:15 compute-0 sudo[71002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:15 compute-0 python3.9[71004]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096374.0170474-630-46344443516633/.source.yaml _original_basename=.ytpcjcrd follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:15 compute-0 sudo[71002]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:15 compute-0 sudo[71154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzgcfrxfcxkcqdrkfbbtttqskuwisijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096375.5304506-660-242799776120041/AnsiballZ_stat.py'
Nov 25 18:46:15 compute-0 sudo[71154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:16 compute-0 python3.9[71156]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:16 compute-0 sudo[71154]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:16 compute-0 sudo[71277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuhmooswaxnhkpmtmdifozvhvyjwxtqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096375.5304506-660-242799776120041/AnsiballZ_copy.py'
Nov 25 18:46:16 compute-0 sudo[71277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:16 compute-0 python3.9[71279]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096375.5304506-660-242799776120041/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:16 compute-0 sudo[71277]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:17 compute-0 sudo[71429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jujjsewwefoxxqvonkeimuuthcgusvlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096377.0791023-690-205926287884554/AnsiballZ_command.py'
Nov 25 18:46:17 compute-0 sudo[71429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:17 compute-0 python3.9[71431]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:17 compute-0 sudo[71429]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:18 compute-0 sudo[71582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrkiuzkhqwmhuodjvgguhkdylcxckekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096377.9103343-706-250733554331118/AnsiballZ_command.py'
Nov 25 18:46:18 compute-0 sudo[71582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:18 compute-0 python3.9[71584]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:18 compute-0 sudo[71582]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:19 compute-0 sudo[71735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiugcnkeizkqiznxtcpejbozhywuscir ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764096378.7766767-722-154542851953501/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 18:46:19 compute-0 sudo[71735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:19 compute-0 python3[71737]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 18:46:19 compute-0 sudo[71735]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:20 compute-0 sudo[71887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmolwaxjoexayocpcoiqkduubbtdbyvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096379.713789-738-11048807050971/AnsiballZ_stat.py'
Nov 25 18:46:20 compute-0 sudo[71887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:20 compute-0 python3.9[71889]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:20 compute-0 sudo[71887]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:20 compute-0 sudo[72010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pazmhrlyfwzxbjgtxcbpgbiyvtskghve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096379.713789-738-11048807050971/AnsiballZ_copy.py'
Nov 25 18:46:20 compute-0 sudo[72010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:21 compute-0 python3.9[72012]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096379.713789-738-11048807050971/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:21 compute-0 sudo[72010]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:21 compute-0 sudo[72162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ichcftbmzpyzzgjimnxuhqffpiaiuwbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096381.3563094-768-147753447609599/AnsiballZ_stat.py'
Nov 25 18:46:21 compute-0 sudo[72162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:21 compute-0 python3.9[72164]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:21 compute-0 sudo[72162]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:22 compute-0 sudo[72285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axpcqlwtgjmyncncilvsdwzyijcwqgwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096381.3563094-768-147753447609599/AnsiballZ_copy.py'
Nov 25 18:46:22 compute-0 sudo[72285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:22 compute-0 python3.9[72287]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096381.3563094-768-147753447609599/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:22 compute-0 sudo[72285]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:23 compute-0 sudo[72437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmrufmnajtvscdkhbejxxyandpmmimwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096382.8771446-798-268851906800442/AnsiballZ_stat.py'
Nov 25 18:46:23 compute-0 sudo[72437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:23 compute-0 python3.9[72439]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:23 compute-0 sudo[72437]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:23 compute-0 sudo[72560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwfrtaakwexjscsnwbcjzimsysyjtjot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096382.8771446-798-268851906800442/AnsiballZ_copy.py'
Nov 25 18:46:24 compute-0 sudo[72560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:24 compute-0 python3.9[72562]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096382.8771446-798-268851906800442/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:24 compute-0 sudo[72560]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:24 compute-0 sudo[72712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyzvdzroiftfpnoffrlixmuhclcobgfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096384.454946-828-199876182264296/AnsiballZ_stat.py'
Nov 25 18:46:24 compute-0 sudo[72712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:25 compute-0 python3.9[72714]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:25 compute-0 sudo[72712]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:25 compute-0 sudo[72835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwavjwvavzpiiebssknwhafyuicidjym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096384.454946-828-199876182264296/AnsiballZ_copy.py'
Nov 25 18:46:25 compute-0 sudo[72835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:25 compute-0 python3.9[72837]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096384.454946-828-199876182264296/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:25 compute-0 sudo[72835]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:26 compute-0 sudo[72987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjnsrxblcthszzhhaaskofijwsdzmkbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096385.9736602-858-192180723214779/AnsiballZ_stat.py'
Nov 25 18:46:26 compute-0 sudo[72987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:26 compute-0 python3.9[72989]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:46:26 compute-0 sudo[72987]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:27 compute-0 sudo[73110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzijjmtqqkklngbapgdtpwtklagdqeiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096385.9736602-858-192180723214779/AnsiballZ_copy.py'
Nov 25 18:46:27 compute-0 sudo[73110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:27 compute-0 python3.9[73112]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096385.9736602-858-192180723214779/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:27 compute-0 sudo[73110]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:27 compute-0 sudo[73262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqcmicecheibshsjbbicoxnndhigjivo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096387.512888-888-268211545842527/AnsiballZ_file.py'
Nov 25 18:46:27 compute-0 sudo[73262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:28 compute-0 python3.9[73264]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:28 compute-0 sudo[73262]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:28 compute-0 sudo[73414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnrgpufnanudcwvktjnxoijrxmpdkufo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096388.347019-904-235909830261931/AnsiballZ_command.py'
Nov 25 18:46:28 compute-0 sudo[73414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:28 compute-0 python3.9[73416]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:28 compute-0 sudo[73414]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:29 compute-0 sudo[73573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdojjqdyliutkbjbatsoeosdrouqvuph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096389.2300134-920-138431152901258/AnsiballZ_blockinfile.py'
Nov 25 18:46:29 compute-0 sudo[73573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:29 compute-0 python3.9[73575]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:30 compute-0 sudo[73573]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:31 compute-0 sudo[73726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvlusihnkjtzyjfjzaaqrosscxiatqnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096390.7807665-938-94350179716317/AnsiballZ_file.py'
Nov 25 18:46:31 compute-0 sudo[73726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:31 compute-0 python3.9[73728]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:31 compute-0 sudo[73726]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:31 compute-0 sudo[73878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzluriqbtazhnemvygynmyfjvcpmxpiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096391.569003-938-140930265964108/AnsiballZ_file.py'
Nov 25 18:46:31 compute-0 sudo[73878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:32 compute-0 python3.9[73880]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:32 compute-0 sudo[73878]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:33 compute-0 sudo[74030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtplobgayhchrbdchkqeucavuynsajuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096392.3671765-968-184713788350730/AnsiballZ_mount.py'
Nov 25 18:46:33 compute-0 sudo[74030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:33 compute-0 python3.9[74032]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 25 18:46:33 compute-0 sudo[74030]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:33 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:46:33 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:46:33 compute-0 sudo[74184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocjqpuwmdvlrryikqkbqsnfduhmftbpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096393.4683623-968-15169691333609/AnsiballZ_mount.py'
Nov 25 18:46:33 compute-0 sudo[74184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:34 compute-0 python3.9[74186]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 25 18:46:34 compute-0 sudo[74184]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:34 compute-0 sshd-session[65023]: Connection closed by 192.168.122.30 port 33322
Nov 25 18:46:34 compute-0 sshd-session[65020]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:46:34 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 25 18:46:34 compute-0 systemd[1]: session-15.scope: Consumed 44.587s CPU time.
Nov 25 18:46:34 compute-0 systemd-logind[820]: Session 15 logged out. Waiting for processes to exit.
Nov 25 18:46:34 compute-0 systemd-logind[820]: Removed session 15.
Nov 25 18:46:40 compute-0 sshd-session[74212]: Accepted publickey for zuul from 192.168.122.30 port 46338 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:46:40 compute-0 systemd-logind[820]: New session 16 of user zuul.
Nov 25 18:46:40 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 25 18:46:40 compute-0 sshd-session[74212]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:46:41 compute-0 sudo[74365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwxgcsihhhqaurkbzncnzqagaucjutjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096400.505191-17-210345430581015/AnsiballZ_tempfile.py'
Nov 25 18:46:41 compute-0 sudo[74365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:41 compute-0 python3.9[74367]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 25 18:46:41 compute-0 sudo[74365]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:41 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 25 18:46:42 compute-0 sudo[74520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggkmlujlmvxvmmjtshxupehqzrbfdcjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096401.5258427-41-147141990559838/AnsiballZ_stat.py'
Nov 25 18:46:42 compute-0 sudo[74520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:42 compute-0 python3.9[74522]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:46:42 compute-0 sudo[74520]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:43 compute-0 sudo[74672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhovxekwqcsxxdejrtjyoakwqatshxda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096402.6844327-61-204027745907735/AnsiballZ_setup.py'
Nov 25 18:46:43 compute-0 sudo[74672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:43 compute-0 python3.9[74674]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:46:43 compute-0 sudo[74672]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:44 compute-0 sudo[74824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmvnxliecxuyrhdzcdapteuzpgoqwxqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096404.0344245-78-239844601170538/AnsiballZ_blockinfile.py'
Nov 25 18:46:44 compute-0 sudo[74824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:44 compute-0 python3.9[74826]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCjc7H9LeG+Lm2eT/iJtIrHXrwdzCvm9UhlUXYB6TjACDWkJeX5FVfrGIpBTTaYwZrfp6ffD8FtNQD6WftP9eAEz+N/qL22S2r5nX6bS+wdviZJ6QbKpcnq5eUQsOZ+cB6abPKhFCBIvqDYH7K+LaarI9Ju/o6b92lRt2kdAHfJyjC2A540RAhxKaJm5AlBxR3eDIE2+wU/DFaWk7Ac8UJiAm8vnNJskFlMAhQzr4w6Evofcz62YIve+40/Uso4hLa43oxgXPWm/Dqw6vMBvNLDzlH9SPmxLpE8TLWABYnoj+tbq7qw+toz4GEcI4L+E7WEnbsOthSfm1WXe8Zji7oFXX6C89pEXjGRh0gRostxeIeB3ylVRndAnuFhEmJ760m2IPhdTaAZZGdtNcqB9LJAI8tDUumId6QKsI53WuBWtEqRpQ+98YUY9JAcEaUM5A4ctMeopuzLEiNSy1ts2ww+Q24LopVmEM2He9TNmxlksxUgfIgYHH43LAcofp+SqJk=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGbSHEbqfpU1YXvg3cZi2GK93VHsc+dUDEd9rS/6j0N4
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH6NfISc6NAkaIQHNgTR8KJ8pHBCgepvcfIDfDt7ZdxknHiXpuYnYCLL6WxwKH2tigAwiMZi91UKPG2BoNOhzX8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtj3bzDVr3KXjYBKm6igMdK5RN8i3viVuoqMp3+KGo3HT/Mt+q7uMByQWLLvnouO4zsMepHOUkEfFbrXWSxr6ZjHu2+gFKnen2GjhnidQWi8XFOX2AiMVNpl4/MMU8UiDANdA793MMl+PO5rzakm0Rq1CKK8p/t3iLpM7EaqlMOOoK3JxXYi4B2W2ndfp6VQsX6yhOD37rnPx1vk1pt/jKnC60rya2rLUUOSg2puwS/+W6NMLYw7i0xa9wOBl93k2dUbVOd53/L3+oKZhMggp50YJUgMn6NHiOi0xRQkkvRZiuFF86Z5dvCNNe5EEMWXhmDw6MlcDwm/h5ZUFtZIZfTfOqmV6JZS2Gas1rWtszlrGmJDPvkTikIPDGOvTISavZu409san5E7wIMsLNI2GBoY6j5FE+2GfnumczL7HR2mVB4EWNveQmbtHDMI1rKPz5/zFNTP1/jgqro98QEy1I39cZFP59fhx2LryRv+ZBVjZiCUXEUSA7oxdZvpok6dU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBz0lEBm4bY1yR+KbMwWHdyDG3RiO5Anp8LqRRgVYEE8
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX0YUz2v+SCYzOZYI9RMj58j5RIJ21KeQEssNgF85/If1o/27pLLQSEMvJ3uhKbcjzWZ8QGxx8+6OqeYqAnOvo=
                                             create=True mode=0644 path=/tmp/ansible.m0xdso94 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:44 compute-0 sudo[74824]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:45 compute-0 sudo[74976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buyntpkrlyokmrsxyiemhdwagpuogooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096404.9749672-94-95978221551191/AnsiballZ_command.py'
Nov 25 18:46:45 compute-0 sudo[74976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:45 compute-0 python3.9[74978]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.m0xdso94' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:45 compute-0 sudo[74976]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:46 compute-0 sudo[75130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwjgjbstaeihwxarvytaeberlrjjkwuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096406.008878-110-78507358391463/AnsiballZ_file.py'
Nov 25 18:46:46 compute-0 sudo[75130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:46 compute-0 python3.9[75132]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.m0xdso94 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:46:46 compute-0 sudo[75130]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:47 compute-0 sshd-session[74215]: Connection closed by 192.168.122.30 port 46338
Nov 25 18:46:47 compute-0 sshd-session[74212]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:46:47 compute-0 systemd-logind[820]: Session 16 logged out. Waiting for processes to exit.
Nov 25 18:46:47 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 25 18:46:47 compute-0 systemd[1]: session-16.scope: Consumed 4.348s CPU time.
Nov 25 18:46:47 compute-0 systemd-logind[820]: Removed session 16.
Nov 25 18:46:52 compute-0 sshd-session[75157]: Accepted publickey for zuul from 192.168.122.30 port 42524 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:46:52 compute-0 systemd-logind[820]: New session 17 of user zuul.
Nov 25 18:46:52 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 25 18:46:52 compute-0 sshd-session[75157]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:46:53 compute-0 python3.9[75310]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:46:54 compute-0 sudo[75464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyljcertddmtsnkrzleneghnqkhuxzzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096414.1154373-44-99739532502550/AnsiballZ_systemd.py'
Nov 25 18:46:54 compute-0 sudo[75464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:55 compute-0 python3.9[75466]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 25 18:46:55 compute-0 sudo[75464]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:55 compute-0 sudo[75618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvgkdvmvcepxvijvzfnmlkdipmfawpqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096415.4787204-60-105860201958104/AnsiballZ_systemd.py'
Nov 25 18:46:55 compute-0 sudo[75618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:56 compute-0 python3.9[75620]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:46:56 compute-0 sudo[75618]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:56 compute-0 sudo[75771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfpveiuvwonmgzaruewulatnpfozesrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096416.44893-78-18447630250719/AnsiballZ_command.py'
Nov 25 18:46:56 compute-0 sudo[75771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:57 compute-0 python3.9[75773]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:57 compute-0 sudo[75771]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:58 compute-0 sudo[75924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jncmfjcuoeeqyaurcmundmlzduiewwdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096417.4865267-94-72511884984524/AnsiballZ_stat.py'
Nov 25 18:46:58 compute-0 sudo[75924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:58 compute-0 python3.9[75926]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:46:58 compute-0 sudo[75924]: pam_unix(sudo:session): session closed for user root
Nov 25 18:46:58 compute-0 sudo[76078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpdeuynsuixjocstwitlsghzakruzhjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096418.5682971-110-23057134889380/AnsiballZ_command.py'
Nov 25 18:46:58 compute-0 sudo[76078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:46:59 compute-0 python3.9[76080]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:46:59 compute-0 sudo[76078]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:00 compute-0 sudo[76233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exebzdytunnhzbkfpuqvqtrcvtnseeye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096419.4630988-126-8519831432854/AnsiballZ_file.py'
Nov 25 18:47:00 compute-0 sudo[76233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:00 compute-0 python3.9[76235]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:00 compute-0 sudo[76233]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:00 compute-0 sshd-session[75160]: Connection closed by 192.168.122.30 port 42524
Nov 25 18:47:00 compute-0 sshd-session[75157]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:47:00 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 25 18:47:00 compute-0 systemd[1]: session-17.scope: Consumed 5.536s CPU time.
Nov 25 18:47:00 compute-0 systemd-logind[820]: Session 17 logged out. Waiting for processes to exit.
Nov 25 18:47:00 compute-0 systemd-logind[820]: Removed session 17.
Nov 25 18:47:05 compute-0 sshd-session[76260]: Accepted publickey for zuul from 192.168.122.30 port 43870 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:47:05 compute-0 systemd-logind[820]: New session 18 of user zuul.
Nov 25 18:47:05 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 25 18:47:05 compute-0 sshd-session[76260]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:47:06 compute-0 python3.9[76413]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:47:08 compute-0 sudo[76567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cthffpzsevdplytnoqaxravvxmrspdyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096427.6842628-48-16542045326351/AnsiballZ_setup.py'
Nov 25 18:47:08 compute-0 sudo[76567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:08 compute-0 python3.9[76569]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:47:09 compute-0 sudo[76567]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:09 compute-0 sudo[76651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovetrihzrwoknijpnaqnhbevzbssqymw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096427.6842628-48-16542045326351/AnsiballZ_dnf.py'
Nov 25 18:47:09 compute-0 sudo[76651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:10 compute-0 python3.9[76653]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 25 18:47:11 compute-0 sudo[76651]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:12 compute-0 python3.9[76804]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:47:13 compute-0 python3.9[76955]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 18:47:14 compute-0 python3.9[77105]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:47:15 compute-0 python3.9[77255]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:47:16 compute-0 sshd-session[76263]: Connection closed by 192.168.122.30 port 43870
Nov 25 18:47:16 compute-0 sshd-session[76260]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:47:16 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 25 18:47:16 compute-0 systemd[1]: session-18.scope: Consumed 6.592s CPU time.
Nov 25 18:47:16 compute-0 systemd-logind[820]: Session 18 logged out. Waiting for processes to exit.
Nov 25 18:47:16 compute-0 systemd-logind[820]: Removed session 18.
Nov 25 18:47:22 compute-0 sshd-session[77280]: Accepted publickey for zuul from 192.168.122.30 port 45834 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:47:22 compute-0 systemd-logind[820]: New session 19 of user zuul.
Nov 25 18:47:22 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 25 18:47:22 compute-0 sshd-session[77280]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:47:23 compute-0 python3.9[77433]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:47:25 compute-0 sudo[77587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtclkphzjlshhqdkzvaczbxrkfrmosou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096444.5161364-79-165423134218022/AnsiballZ_file.py'
Nov 25 18:47:25 compute-0 sudo[77587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:25 compute-0 python3.9[77589]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:25 compute-0 sudo[77587]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:26 compute-0 sudo[77739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvvjsixmiiiutgayhdakeylrhpnnidiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096445.7850714-79-51842742942148/AnsiballZ_file.py'
Nov 25 18:47:26 compute-0 sudo[77739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:26 compute-0 python3.9[77741]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:26 compute-0 sudo[77739]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:27 compute-0 sudo[77891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvmtnexrrymkdmcsqqrylvsnlrufiumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096446.8343687-116-170254923852894/AnsiballZ_stat.py'
Nov 25 18:47:27 compute-0 sudo[77891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:27 compute-0 python3.9[77893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:27 compute-0 sudo[77891]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:27 compute-0 sudo[78014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynrcozvfaktmwrmlpdwolhtrfhsllykw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096446.8343687-116-170254923852894/AnsiballZ_copy.py'
Nov 25 18:47:27 compute-0 sudo[78014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:28 compute-0 python3.9[78016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096446.8343687-116-170254923852894/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=4d31aaefc6ffb824ae4e75d9435f758da4f08525 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:28 compute-0 sudo[78014]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:28 compute-0 sudo[78166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnojjyrzqtmargadcputzolyytvsmvkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096448.2592027-116-84835378022206/AnsiballZ_stat.py'
Nov 25 18:47:28 compute-0 sudo[78166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:28 compute-0 python3.9[78168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:28 compute-0 sudo[78166]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:29 compute-0 sudo[78289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgkcrchdjilbasjeqrspxitzikqviyke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096448.2592027-116-84835378022206/AnsiballZ_copy.py'
Nov 25 18:47:29 compute-0 sudo[78289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:29 compute-0 python3.9[78291]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096448.2592027-116-84835378022206/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=9b785cd87de94a1f178659de46a2c1328880a6ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:29 compute-0 sudo[78289]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:30 compute-0 sudo[78441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuayxaobqttkbdwtmbxjexdgfchlepmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096449.7567272-116-48060024910841/AnsiballZ_stat.py'
Nov 25 18:47:30 compute-0 sudo[78441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:30 compute-0 python3.9[78443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:30 compute-0 sudo[78441]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:30 compute-0 sudo[78564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukewvgeudmifkagjietkceowcpvzjrhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096449.7567272-116-48060024910841/AnsiballZ_copy.py'
Nov 25 18:47:30 compute-0 sudo[78564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:30 compute-0 python3.9[78566]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096449.7567272-116-48060024910841/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=5ce6b142d7969dadcc79607dac3bb92f6be90385 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:30 compute-0 sudo[78564]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:31 compute-0 sudo[78716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkdtbxzlvyzyjafjhobbdryptwqxkxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096451.136761-198-118709884869834/AnsiballZ_file.py'
Nov 25 18:47:31 compute-0 sudo[78716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:31 compute-0 python3.9[78718]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:31 compute-0 sudo[78716]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:32 compute-0 sudo[78868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnemepjzcbrqanxpvvvpawndnpsueelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096452.0170069-198-31291590295808/AnsiballZ_file.py'
Nov 25 18:47:32 compute-0 sudo[78868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:32 compute-0 python3.9[78870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:32 compute-0 sudo[78868]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:33 compute-0 sudo[79020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqemmqttpxycrszloaedpyutzvnkxmzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096452.6760566-227-235427023130039/AnsiballZ_stat.py'
Nov 25 18:47:33 compute-0 sudo[79020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:33 compute-0 python3.9[79022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:33 compute-0 sudo[79020]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:33 compute-0 sudo[79143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djeipdzpzgslbhldsochdvbwetcjwntx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096452.6760566-227-235427023130039/AnsiballZ_copy.py'
Nov 25 18:47:33 compute-0 sudo[79143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:33 compute-0 python3.9[79145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096452.6760566-227-235427023130039/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e416efa26958777a5aacebf7ac0f6918dcf8e093 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:33 compute-0 sudo[79143]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:34 compute-0 sudo[79295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tihirhitwaxtmzhazigpbawcktgcjaqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096453.9588833-227-166279767227775/AnsiballZ_stat.py'
Nov 25 18:47:34 compute-0 sudo[79295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:34 compute-0 python3.9[79297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:34 compute-0 sudo[79295]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:34 compute-0 sudo[79418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeiuteizanedmxvvqeqtwyliboycpxnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096453.9588833-227-166279767227775/AnsiballZ_copy.py'
Nov 25 18:47:34 compute-0 sudo[79418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:35 compute-0 python3.9[79420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096453.9588833-227-166279767227775/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=155fbc04a7cc656962aee554a3fbf6f0045ca4d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:35 compute-0 sudo[79418]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:35 compute-0 sudo[79570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaoxfympefwtkqdhvmoltwerpntakxvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096455.3427963-227-225095710291221/AnsiballZ_stat.py'
Nov 25 18:47:35 compute-0 sudo[79570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:35 compute-0 python3.9[79572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:35 compute-0 sudo[79570]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:36 compute-0 sudo[79693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohjsiqpzxkzlkaaqayxflrqkigbtdunt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096455.3427963-227-225095710291221/AnsiballZ_copy.py'
Nov 25 18:47:36 compute-0 sudo[79693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:36 compute-0 python3.9[79695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096455.3427963-227-225095710291221/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0f4d2b9a42b2c11e61583ffdd0dbfaa529bc5569 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:36 compute-0 sudo[79693]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:37 compute-0 sudo[79845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghgbcsfybhsrvryorwciptrnrrfvwnmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096456.9293625-314-231916434330444/AnsiballZ_file.py'
Nov 25 18:47:37 compute-0 sudo[79845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:37 compute-0 chronyd[64994]: Selected source 142.4.192.253 (pool.ntp.org)
Nov 25 18:47:37 compute-0 python3.9[79847]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:37 compute-0 sudo[79845]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:38 compute-0 sudo[79997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbuihfwmxdjyobqspausfmybfjbdnlyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096457.6895213-314-33390429403632/AnsiballZ_file.py'
Nov 25 18:47:38 compute-0 sudo[79997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:38 compute-0 python3.9[79999]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:38 compute-0 sudo[79997]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:38 compute-0 sudo[80149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azygvpycpldjsnqgqugdvhijqhwjbhxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096458.48328-345-255847067387927/AnsiballZ_stat.py'
Nov 25 18:47:38 compute-0 sudo[80149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:39 compute-0 python3.9[80151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:39 compute-0 sudo[80149]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:39 compute-0 sudo[80272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypdoejwrlsuyceywdzpxzknzdiyrqujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096458.48328-345-255847067387927/AnsiballZ_copy.py'
Nov 25 18:47:39 compute-0 sudo[80272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:39 compute-0 python3.9[80274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096458.48328-345-255847067387927/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=026506eb20e3a65bca557f8a5c0f9b15e6a0693b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:39 compute-0 sudo[80272]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:40 compute-0 sudo[80424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzttfxgkaaeqqewlntbfybqxkhtbnsyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096460.0032647-345-168617554999537/AnsiballZ_stat.py'
Nov 25 18:47:40 compute-0 sudo[80424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:40 compute-0 python3.9[80426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:40 compute-0 sudo[80424]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:40 compute-0 sudo[80547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybjmzggpvgbvcuebovjnugjnuslyukad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096460.0032647-345-168617554999537/AnsiballZ_copy.py'
Nov 25 18:47:40 compute-0 sudo[80547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:41 compute-0 python3.9[80549]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096460.0032647-345-168617554999537/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a9c654b2e34167728592a477648f3dbfc4b73b67 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:41 compute-0 sudo[80547]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:41 compute-0 sudo[80699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bseorigdtwtvumouatgvegppwbfgvskx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096461.340041-345-182417704191572/AnsiballZ_stat.py'
Nov 25 18:47:41 compute-0 sudo[80699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:41 compute-0 python3.9[80701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:41 compute-0 sudo[80699]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:42 compute-0 sudo[80822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttxfxbddougqoigbrwejhqknkzvlmiha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096461.340041-345-182417704191572/AnsiballZ_copy.py'
Nov 25 18:47:42 compute-0 sudo[80822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:42 compute-0 python3.9[80824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096461.340041-345-182417704191572/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=cda73d91aaf2f90b86c30071e157891bab41c8ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:42 compute-0 sudo[80822]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:43 compute-0 sudo[80974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uychnplwqepizhsefwenyhrghrlqzhsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096462.7604144-432-34906253001081/AnsiballZ_file.py'
Nov 25 18:47:43 compute-0 sudo[80974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:43 compute-0 python3.9[80976]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:43 compute-0 sudo[80974]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:43 compute-0 sudo[81126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebzcrdjtsewqvcqaggrxngphkkcabfwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096463.4752238-432-198266542483367/AnsiballZ_file.py'
Nov 25 18:47:43 compute-0 sudo[81126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:44 compute-0 python3.9[81128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:44 compute-0 sudo[81126]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:44 compute-0 sudo[81278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mybssnnokifrwqchddtpiucmidomawpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096464.239565-463-184401607242114/AnsiballZ_stat.py'
Nov 25 18:47:44 compute-0 sudo[81278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:44 compute-0 python3.9[81280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:44 compute-0 sudo[81278]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:45 compute-0 sudo[81401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlotsbcwffholsgwhkjbbqohoilpqgox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096464.239565-463-184401607242114/AnsiballZ_copy.py'
Nov 25 18:47:45 compute-0 sudo[81401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:45 compute-0 python3.9[81403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096464.239565-463-184401607242114/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=93f5a3bc27b607c9efae719eb8612d3d99b0b9af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:45 compute-0 sudo[81401]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:46 compute-0 sudo[81553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyikorpnygzinpepnwuofkjdhkgsmlfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096465.729837-463-105797739253907/AnsiballZ_stat.py'
Nov 25 18:47:46 compute-0 sudo[81553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:46 compute-0 python3.9[81555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:46 compute-0 sudo[81553]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:46 compute-0 sudo[81676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgphegnpeovgwhwfcthpsokntsdnfwfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096465.729837-463-105797739253907/AnsiballZ_copy.py'
Nov 25 18:47:46 compute-0 sudo[81676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:46 compute-0 python3.9[81678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096465.729837-463-105797739253907/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a9c654b2e34167728592a477648f3dbfc4b73b67 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:47 compute-0 sudo[81676]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:47 compute-0 sudo[81828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duscajyldgeastlhlhtjjxancxtqszgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096467.2135715-463-246380506153013/AnsiballZ_stat.py'
Nov 25 18:47:47 compute-0 sudo[81828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:47 compute-0 python3.9[81830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:47 compute-0 sudo[81828]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:48 compute-0 sudo[81951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcvzboeaqhtekltiztnowxrhrkmzszzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096467.2135715-463-246380506153013/AnsiballZ_copy.py'
Nov 25 18:47:48 compute-0 sudo[81951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:48 compute-0 python3.9[81953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096467.2135715-463-246380506153013/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=78cc27798f80cdf3ffa3a9a4a61be4479ddafe4c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:48 compute-0 sudo[81951]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:50 compute-0 sudo[82103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fypzxuyysqvxefvbcruvblwgtfxaochb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096469.745364-591-88630315630520/AnsiballZ_file.py'
Nov 25 18:47:50 compute-0 sudo[82103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:50 compute-0 python3.9[82105]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:50 compute-0 sudo[82103]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:50 compute-0 sudo[82255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaozvhsblycxoqbznppvhsqjctpmftju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096470.4929464-607-122177177346454/AnsiballZ_stat.py'
Nov 25 18:47:50 compute-0 sudo[82255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:51 compute-0 python3.9[82257]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:51 compute-0 sudo[82255]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:51 compute-0 sudo[82378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjdtdplylgjwzgtphgpgjznmllptmbbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096470.4929464-607-122177177346454/AnsiballZ_copy.py'
Nov 25 18:47:51 compute-0 sudo[82378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:51 compute-0 python3.9[82380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096470.4929464-607-122177177346454/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:51 compute-0 sudo[82378]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:52 compute-0 sudo[82530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qobuzdynnvwajtqizzkgobglizsslxxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096471.9457922-639-152005981083030/AnsiballZ_file.py'
Nov 25 18:47:52 compute-0 sudo[82530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:52 compute-0 python3.9[82532]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:52 compute-0 sudo[82530]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:53 compute-0 sudo[82682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbtghyvmybcwcvnyyfciivhgcugyykep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096472.675863-655-230171174617964/AnsiballZ_stat.py'
Nov 25 18:47:53 compute-0 sudo[82682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:53 compute-0 python3.9[82684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:53 compute-0 sudo[82682]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:53 compute-0 sudo[82805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfebagzhgcpyizrduzirnutzwhmxyidf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096472.675863-655-230171174617964/AnsiballZ_copy.py'
Nov 25 18:47:53 compute-0 sudo[82805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:54 compute-0 python3.9[82807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096472.675863-655-230171174617964/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:54 compute-0 sudo[82805]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:54 compute-0 sudo[82957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icjcstixyevhwbpprzxqytgewqrmhwfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096474.2761307-688-204630989308522/AnsiballZ_file.py'
Nov 25 18:47:54 compute-0 sudo[82957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:54 compute-0 python3.9[82959]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:54 compute-0 sudo[82957]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:55 compute-0 sudo[83109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tszibmyeahzcbtqqijrzfoojgkvcdruw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096475.083395-704-260002787240579/AnsiballZ_stat.py'
Nov 25 18:47:55 compute-0 sudo[83109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:55 compute-0 python3.9[83111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:55 compute-0 sudo[83109]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:56 compute-0 sudo[83232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzcjladjxecpsttcawwojnksveielrzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096475.083395-704-260002787240579/AnsiballZ_copy.py'
Nov 25 18:47:56 compute-0 sudo[83232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:56 compute-0 python3.9[83234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096475.083395-704-260002787240579/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:56 compute-0 sudo[83232]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:57 compute-0 sudo[83384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osxylufjrfqxjrskgesitofrnwyxdzge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096476.7842877-735-230004501296004/AnsiballZ_file.py'
Nov 25 18:47:57 compute-0 sudo[83384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:57 compute-0 python3.9[83386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:57 compute-0 sudo[83384]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:58 compute-0 sudo[83536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvyfequpjwfsyzktebtzeliiadhjgfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096477.6285589-753-206801263389528/AnsiballZ_stat.py'
Nov 25 18:47:58 compute-0 sudo[83536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:58 compute-0 python3.9[83538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:47:58 compute-0 sudo[83536]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:58 compute-0 sudo[83659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrfozwcngrxzmumymfmxhsfdzupsvbck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096477.6285589-753-206801263389528/AnsiballZ_copy.py'
Nov 25 18:47:58 compute-0 sudo[83659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:58 compute-0 python3.9[83661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096477.6285589-753-206801263389528/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:47:58 compute-0 sudo[83659]: pam_unix(sudo:session): session closed for user root
Nov 25 18:47:59 compute-0 sudo[83811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zarjxtowjihxjzcoriqcstzwumxoszzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096479.1527903-784-250890496443913/AnsiballZ_file.py'
Nov 25 18:47:59 compute-0 sudo[83811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:47:59 compute-0 python3.9[83813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:47:59 compute-0 sudo[83811]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:00 compute-0 sudo[83963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpivjkifgysvazhwwzpxscyapltrulju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096479.995781-800-249827064758685/AnsiballZ_stat.py'
Nov 25 18:48:00 compute-0 sudo[83963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:00 compute-0 python3.9[83965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:00 compute-0 sudo[83963]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:01 compute-0 sudo[84086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gguoeqnficloyphmfmjdgpzmvmjqttae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096479.995781-800-249827064758685/AnsiballZ_copy.py'
Nov 25 18:48:01 compute-0 sudo[84086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:01 compute-0 python3.9[84088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096479.995781-800-249827064758685/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:01 compute-0 sudo[84086]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:01 compute-0 sudo[84238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzxhzgsrguhlcceudaqwzqpfadcyffvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096481.617187-831-147620156848935/AnsiballZ_file.py'
Nov 25 18:48:01 compute-0 sudo[84238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:02 compute-0 python3.9[84240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:02 compute-0 sudo[84238]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:02 compute-0 sudo[84390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgkelmkeznxkjwzdxwbvcnqusjnxcijr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096482.4392898-847-75810501413770/AnsiballZ_stat.py'
Nov 25 18:48:02 compute-0 sudo[84390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:02 compute-0 python3.9[84392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:02 compute-0 sudo[84390]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:03 compute-0 sudo[84513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giacqchvdlywfqabiiinxhzihyqedsse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096482.4392898-847-75810501413770/AnsiballZ_copy.py'
Nov 25 18:48:03 compute-0 sudo[84513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:03 compute-0 python3.9[84515]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096482.4392898-847-75810501413770/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:03 compute-0 sudo[84513]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:04 compute-0 sudo[84665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnngevxtopibayxehkqwinxrmsxalwpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096483.9971907-879-221460569095896/AnsiballZ_file.py'
Nov 25 18:48:04 compute-0 sudo[84665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:04 compute-0 python3.9[84667]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:04 compute-0 sudo[84665]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:05 compute-0 sudo[84817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyuiuvhzjzwdfcgirccpewrfertkubnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096484.7889555-895-142878288623740/AnsiballZ_stat.py'
Nov 25 18:48:05 compute-0 sudo[84817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:05 compute-0 python3.9[84819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:05 compute-0 sudo[84817]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:05 compute-0 sudo[84940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcfmjbzsingljtuzxdoewmugrccxdnbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096484.7889555-895-142878288623740/AnsiballZ_copy.py'
Nov 25 18:48:05 compute-0 sudo[84940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:06 compute-0 python3.9[84942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096484.7889555-895-142878288623740/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=110188508d39de0258c0959e3bc941a100e6a11a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:06 compute-0 sudo[84940]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:06 compute-0 sshd-session[77283]: Connection closed by 192.168.122.30 port 45834
Nov 25 18:48:06 compute-0 sshd-session[77280]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:48:06 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 25 18:48:06 compute-0 systemd[1]: session-19.scope: Consumed 34.767s CPU time.
Nov 25 18:48:06 compute-0 systemd-logind[820]: Session 19 logged out. Waiting for processes to exit.
Nov 25 18:48:06 compute-0 systemd-logind[820]: Removed session 19.
Nov 25 18:48:12 compute-0 sshd-session[84967]: Accepted publickey for zuul from 192.168.122.30 port 60336 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:48:12 compute-0 systemd-logind[820]: New session 20 of user zuul.
Nov 25 18:48:12 compute-0 systemd[1]: Started Session 20 of User zuul.
Nov 25 18:48:12 compute-0 sshd-session[84967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:48:13 compute-0 python3.9[85120]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:48:14 compute-0 sudo[85274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdwueclcpcjpllozvctopnardmhgqgnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096494.0100815-48-73773796264395/AnsiballZ_file.py'
Nov 25 18:48:14 compute-0 sudo[85274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:14 compute-0 python3.9[85276]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:14 compute-0 sudo[85274]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:15 compute-0 sudo[85426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfbvlgttfvdzwqurqabvhywmuckccwky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096495.000009-48-83081042703244/AnsiballZ_file.py'
Nov 25 18:48:15 compute-0 sudo[85426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:15 compute-0 python3.9[85428]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:15 compute-0 sudo[85426]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:16 compute-0 python3.9[85578]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:48:17 compute-0 sudo[85728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjovpckxqyuylfamgxdjsibxuellgccp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096496.784829-94-235363492569398/AnsiballZ_seboolean.py'
Nov 25 18:48:17 compute-0 sudo[85728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:17 compute-0 python3.9[85730]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 25 18:48:18 compute-0 sudo[85728]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:19 compute-0 sudo[85884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcjyxckxrrfwuwgaspffvpngfdzjewra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096499.3352795-114-135421177715920/AnsiballZ_setup.py'
Nov 25 18:48:19 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 25 18:48:19 compute-0 sudo[85884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:20 compute-0 python3.9[85886]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:48:20 compute-0 sudo[85884]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:20 compute-0 sudo[85968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzsbhujyczsjeezusttwlqhcqspbtmou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096499.3352795-114-135421177715920/AnsiballZ_dnf.py'
Nov 25 18:48:20 compute-0 sudo[85968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:21 compute-0 python3.9[85970]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:48:22 compute-0 sudo[85968]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:23 compute-0 sudo[86121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzaukxfgdjanpkpsrhbdmfojkepnopns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096502.570229-138-15075198809421/AnsiballZ_systemd.py'
Nov 25 18:48:23 compute-0 sudo[86121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:23 compute-0 python3.9[86123]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:48:23 compute-0 sudo[86121]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:24 compute-0 sudo[86276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mppgauykjuysmeixexuraicyoaklwmiz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764096504.073424-154-273058428650501/AnsiballZ_edpm_nftables_snippet.py'
Nov 25 18:48:24 compute-0 sudo[86276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:24 compute-0 python3[86278]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 25 18:48:24 compute-0 sudo[86276]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:25 compute-0 sudo[86428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngjsccshmwrinsbmukhrjtlpgnapcbqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096505.3047493-172-238713722064752/AnsiballZ_file.py'
Nov 25 18:48:25 compute-0 sudo[86428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:25 compute-0 python3.9[86430]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:25 compute-0 sudo[86428]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:26 compute-0 sudo[86580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvatbsuqnxwwnkuxyauohlzbpffyeblc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096506.1568105-188-27554620686698/AnsiballZ_stat.py'
Nov 25 18:48:26 compute-0 sudo[86580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:26 compute-0 python3.9[86582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:27 compute-0 sudo[86580]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:27 compute-0 sudo[86658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwcgtyusbnleshqgizuauylumpdbbpxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096506.1568105-188-27554620686698/AnsiballZ_file.py'
Nov 25 18:48:27 compute-0 sudo[86658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:27 compute-0 python3.9[86660]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:27 compute-0 sudo[86658]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:28 compute-0 sudo[86810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zznpzkrxtxprjcljojihiqhvwwxczoeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096507.8012316-212-238566111025921/AnsiballZ_stat.py'
Nov 25 18:48:28 compute-0 sudo[86810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:28 compute-0 python3.9[86812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:28 compute-0 sudo[86810]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:28 compute-0 sudo[86888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stwylimkxcehxklcccatvtdlleiwfqic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096507.8012316-212-238566111025921/AnsiballZ_file.py'
Nov 25 18:48:28 compute-0 sudo[86888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:28 compute-0 python3.9[86890]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.cnb67jv2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:28 compute-0 sudo[86888]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:29 compute-0 sudo[87040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avbhooyzcqmfwkepfdxgqgwyujcztiqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096509.1988225-236-212971390536761/AnsiballZ_stat.py'
Nov 25 18:48:29 compute-0 sudo[87040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:29 compute-0 python3.9[87042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:29 compute-0 sudo[87040]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:30 compute-0 sudo[87118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btqlpyiuyjdwajnjgmugekfthyhxrazx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096509.1988225-236-212971390536761/AnsiballZ_file.py'
Nov 25 18:48:30 compute-0 sudo[87118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:30 compute-0 python3.9[87120]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:30 compute-0 sudo[87118]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:31 compute-0 sudo[87270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aalimyhitqctbgndeormcdfafmmnprzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096510.6735811-262-172280214888005/AnsiballZ_command.py'
Nov 25 18:48:31 compute-0 sudo[87270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:31 compute-0 python3.9[87272]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:31 compute-0 sudo[87270]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:32 compute-0 sudo[87423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdillqarwaazzekzzsrwtfddqteuyibo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764096511.7246962-278-200292167256403/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 18:48:32 compute-0 sudo[87423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:32 compute-0 python3[87425]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 18:48:32 compute-0 sudo[87423]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:33 compute-0 sudo[87575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bltwutuhsqmkhrdxdoqgivrlbjjxayyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096512.724699-294-261928025595377/AnsiballZ_stat.py'
Nov 25 18:48:33 compute-0 sudo[87575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:33 compute-0 python3.9[87577]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:33 compute-0 sudo[87575]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:33 compute-0 sudo[87700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doizaqcqhxiibtdklkakolrgsttoxxha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096512.724699-294-261928025595377/AnsiballZ_copy.py'
Nov 25 18:48:33 compute-0 sudo[87700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:34 compute-0 python3.9[87702]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096512.724699-294-261928025595377/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:34 compute-0 sudo[87700]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:34 compute-0 sudo[87852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzaewcmzqlbbvzfhjjxdcwiinldxaxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096514.4045606-324-195927521111348/AnsiballZ_stat.py'
Nov 25 18:48:34 compute-0 sudo[87852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:35 compute-0 python3.9[87854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:35 compute-0 sudo[87852]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:35 compute-0 sudo[87977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anrvfajzocksbsjeguvfijmquqmphghc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096514.4045606-324-195927521111348/AnsiballZ_copy.py'
Nov 25 18:48:35 compute-0 sudo[87977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:35 compute-0 python3.9[87979]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096514.4045606-324-195927521111348/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:35 compute-0 sudo[87977]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:36 compute-0 sudo[88129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohvfymmcgjurykbzaqvbrdamvhqtyzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096516.0248246-354-70548032608928/AnsiballZ_stat.py'
Nov 25 18:48:36 compute-0 sudo[88129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:36 compute-0 python3.9[88131]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:36 compute-0 sudo[88129]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:37 compute-0 sudo[88254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnsbdqmtzpvaofdhhqmyulljrfdtzqgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096516.0248246-354-70548032608928/AnsiballZ_copy.py'
Nov 25 18:48:37 compute-0 sudo[88254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:37 compute-0 python3.9[88256]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096516.0248246-354-70548032608928/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:37 compute-0 sudo[88254]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:38 compute-0 sudo[88406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpqdjlfaejhwfqijilficvwesnyzzbbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096517.6797938-384-131791503595367/AnsiballZ_stat.py'
Nov 25 18:48:38 compute-0 sudo[88406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:38 compute-0 python3.9[88408]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:38 compute-0 sudo[88406]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:38 compute-0 sudo[88531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqfcyzvqaysgpeefebdyemaxnavxgztf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096517.6797938-384-131791503595367/AnsiballZ_copy.py'
Nov 25 18:48:38 compute-0 sudo[88531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:38 compute-0 python3.9[88533]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096517.6797938-384-131791503595367/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:38 compute-0 sudo[88531]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:39 compute-0 sudo[88683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlubjhbhgrnpyqjuqavkqxikvcvclodm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096519.268978-414-190146594740177/AnsiballZ_stat.py'
Nov 25 18:48:39 compute-0 sudo[88683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:39 compute-0 python3.9[88685]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:39 compute-0 sudo[88683]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:40 compute-0 sudo[88808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dinskymblughtkwajtiddpbnceenapyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096519.268978-414-190146594740177/AnsiballZ_copy.py'
Nov 25 18:48:40 compute-0 sudo[88808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:40 compute-0 python3.9[88810]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096519.268978-414-190146594740177/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:40 compute-0 sudo[88808]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:41 compute-0 sudo[88960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edaqtjsglinfoplddogqdtwpzbnadehl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096520.897526-444-268977594836970/AnsiballZ_file.py'
Nov 25 18:48:41 compute-0 sudo[88960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:41 compute-0 python3.9[88962]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:41 compute-0 sudo[88960]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:42 compute-0 sudo[89112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiphkitigktnfmxkxshkcpvxvlrtftmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096521.718503-460-14507137291563/AnsiballZ_command.py'
Nov 25 18:48:42 compute-0 sudo[89112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:42 compute-0 python3.9[89114]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:42 compute-0 sudo[89112]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:43 compute-0 sudo[89267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxvldwjwxltqgmaodpqinfvgzobgbopt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096522.6272638-476-269964559815074/AnsiballZ_blockinfile.py'
Nov 25 18:48:43 compute-0 sudo[89267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:43 compute-0 python3.9[89269]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:43 compute-0 sudo[89267]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:44 compute-0 sudo[89419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhnnastwtlkkkbgwnenjemeiouafjzse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096523.75507-494-207136652315276/AnsiballZ_command.py'
Nov 25 18:48:44 compute-0 sudo[89419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:44 compute-0 python3.9[89421]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:44 compute-0 sudo[89419]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:44 compute-0 sudo[89572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uysoibhpukavhuquposxqichuuqyjgwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096524.611655-510-231043185932208/AnsiballZ_stat.py'
Nov 25 18:48:44 compute-0 sudo[89572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:45 compute-0 python3.9[89574]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:48:45 compute-0 sudo[89572]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:45 compute-0 sudo[89726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqbjdrdzlwwbboxzwfsjyskndmgauosw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096525.4381213-526-84829969906001/AnsiballZ_command.py'
Nov 25 18:48:45 compute-0 sudo[89726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:46 compute-0 python3.9[89728]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:46 compute-0 sudo[89726]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:46 compute-0 sudo[89881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyfidicaxrrzlvrnbcdylkzbvtufrazc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096526.3587499-542-269156359804430/AnsiballZ_file.py'
Nov 25 18:48:46 compute-0 sudo[89881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:46 compute-0 python3.9[89883]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:46 compute-0 sudo[89881]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:48 compute-0 python3.9[90033]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:48:49 compute-0 sudo[90184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muyfvvgnnozidyhswcwbkmjevfswqyre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096528.878744-622-155958401735133/AnsiballZ_command.py'
Nov 25 18:48:49 compute-0 sudo[90184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:49 compute-0 python3.9[90186]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:49 compute-0 ovs-vsctl[90187]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 25 18:48:49 compute-0 sudo[90184]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:50 compute-0 sudo[90337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efhuxqanhptqmweriwkzsqiqmewnptza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096529.7965043-640-17248288267166/AnsiballZ_command.py'
Nov 25 18:48:50 compute-0 sudo[90337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:50 compute-0 python3.9[90339]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:50 compute-0 sudo[90337]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:51 compute-0 sudo[90492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onrxpprpfkupvwgxxavhheruhluzmins ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096530.647826-656-15126285089373/AnsiballZ_command.py'
Nov 25 18:48:51 compute-0 sudo[90492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:51 compute-0 python3.9[90494]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:48:51 compute-0 ovs-vsctl[90495]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 25 18:48:51 compute-0 sudo[90492]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:52 compute-0 python3.9[90645]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:48:52 compute-0 sudo[90797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygnltcppohpijjbzlabfhimofrnhqwni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096532.4919286-690-244701688077568/AnsiballZ_file.py'
Nov 25 18:48:52 compute-0 sudo[90797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:53 compute-0 python3.9[90799]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:53 compute-0 sudo[90797]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:53 compute-0 sudo[90949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktxrnaistmsmnuqfpjqcunoblogvurdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096533.391261-706-269338442354363/AnsiballZ_stat.py'
Nov 25 18:48:53 compute-0 sudo[90949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:53 compute-0 python3.9[90951]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:54 compute-0 sudo[90949]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:54 compute-0 sudo[91027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zekzniaqgybkvmiiaehucwhvkehuomgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096533.391261-706-269338442354363/AnsiballZ_file.py'
Nov 25 18:48:54 compute-0 sudo[91027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:54 compute-0 python3.9[91029]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:54 compute-0 sudo[91027]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:55 compute-0 sudo[91179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qortsjhsoaipldxezvgucgdikzmbsxzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096534.7578578-706-164257962067668/AnsiballZ_stat.py'
Nov 25 18:48:55 compute-0 sudo[91179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:55 compute-0 python3.9[91181]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:55 compute-0 sudo[91179]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:55 compute-0 sudo[91257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgnumvyoupaagrefprwrjtwlroufunth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096534.7578578-706-164257962067668/AnsiballZ_file.py'
Nov 25 18:48:55 compute-0 sudo[91257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:55 compute-0 python3.9[91259]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:48:55 compute-0 sudo[91257]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:56 compute-0 sudo[91409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxqatpxclvrggcbxowveejmwpeatdfxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096536.1407151-752-36493259418623/AnsiballZ_file.py'
Nov 25 18:48:56 compute-0 sudo[91409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:56 compute-0 python3.9[91411]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:56 compute-0 sudo[91409]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:57 compute-0 sudo[91561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jemykejeyfufkzpuhrhpllsmjtiuaiou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096537.0242803-768-193622514026753/AnsiballZ_stat.py'
Nov 25 18:48:57 compute-0 sudo[91561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:57 compute-0 python3.9[91563]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:57 compute-0 sudo[91561]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:57 compute-0 sudo[91639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fupaezxmappjktemqmfxztrwxjwwkhvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096537.0242803-768-193622514026753/AnsiballZ_file.py'
Nov 25 18:48:58 compute-0 sudo[91639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:58 compute-0 python3.9[91641]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:58 compute-0 sudo[91639]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:58 compute-0 sudo[91791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qravzrkxuhvzcchggiybtstacthwtffw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096538.4608395-792-124872303266301/AnsiballZ_stat.py'
Nov 25 18:48:58 compute-0 sudo[91791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:59 compute-0 python3.9[91793]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:48:59 compute-0 sudo[91791]: pam_unix(sudo:session): session closed for user root
Nov 25 18:48:59 compute-0 sudo[91869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaoslfwnnnckfqhurzlclcpcidrqxnid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096538.4608395-792-124872303266301/AnsiballZ_file.py'
Nov 25 18:48:59 compute-0 sudo[91869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:48:59 compute-0 python3.9[91871]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:48:59 compute-0 sudo[91869]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:00 compute-0 sudo[92021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbwnlfmclqzkxiqhnsiukawnyhmxkzfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096539.8212905-816-87214117583544/AnsiballZ_systemd.py'
Nov 25 18:49:00 compute-0 sudo[92021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:00 compute-0 python3.9[92023]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:49:00 compute-0 systemd[1]: Reloading.
Nov 25 18:49:00 compute-0 systemd-rc-local-generator[92048]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:49:00 compute-0 systemd-sysv-generator[92054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:49:00 compute-0 sudo[92021]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:01 compute-0 sudo[92211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoszxmrinidvkiecjpojrcrrqkmmiaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096541.1197596-832-37514701032978/AnsiballZ_stat.py'
Nov 25 18:49:01 compute-0 sudo[92211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:01 compute-0 python3.9[92213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:01 compute-0 sudo[92211]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:02 compute-0 sudo[92289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsljwsnbgnxwlvantmtmyympzolgpovr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096541.1197596-832-37514701032978/AnsiballZ_file.py'
Nov 25 18:49:02 compute-0 sudo[92289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:02 compute-0 python3.9[92291]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:49:02 compute-0 sudo[92289]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:02 compute-0 sudo[92441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtgrquwqemyaalfoleeknrwtlaojywqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096542.5205164-856-154147988657967/AnsiballZ_stat.py'
Nov 25 18:49:02 compute-0 sudo[92441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:03 compute-0 python3.9[92443]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:03 compute-0 sudo[92441]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:03 compute-0 sudo[92519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npkqbfoeyerridkjtqttlpvsfvnhmywa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096542.5205164-856-154147988657967/AnsiballZ_file.py'
Nov 25 18:49:03 compute-0 sudo[92519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:03 compute-0 python3.9[92521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:49:03 compute-0 sudo[92519]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:04 compute-0 sudo[92671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaugueswsdcgmpczcfrfyjkdhllhluan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096543.8938997-880-216554557359979/AnsiballZ_systemd.py'
Nov 25 18:49:04 compute-0 sudo[92671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:04 compute-0 python3.9[92673]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:49:04 compute-0 systemd[1]: Reloading.
Nov 25 18:49:04 compute-0 systemd-rc-local-generator[92701]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:49:04 compute-0 systemd-sysv-generator[92704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:49:04 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 18:49:04 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 18:49:04 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 18:49:04 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 18:49:05 compute-0 sudo[92671]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:05 compute-0 sudo[92865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbsjdafczootezyiqezqjizyeowwpblz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096545.4069457-900-269312634862650/AnsiballZ_file.py'
Nov 25 18:49:05 compute-0 sudo[92865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:05 compute-0 python3.9[92867]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:06 compute-0 sudo[92865]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:06 compute-0 sudo[93017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmradktfupdsmxhyilxqszlxtybnxctr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096546.2964823-916-139282533604388/AnsiballZ_stat.py'
Nov 25 18:49:06 compute-0 sudo[93017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:06 compute-0 python3.9[93019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:06 compute-0 sudo[93017]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:07 compute-0 sudo[93140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkfunsznniobfkjcqeqvygrfakgpqcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096546.2964823-916-139282533604388/AnsiballZ_copy.py'
Nov 25 18:49:07 compute-0 sudo[93140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:07 compute-0 python3.9[93142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096546.2964823-916-139282533604388/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:07 compute-0 sudo[93140]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:08 compute-0 sudo[93292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwhupgnnycdyosfplqotznqqfqzeypse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096548.2082055-950-25336626530314/AnsiballZ_file.py'
Nov 25 18:49:08 compute-0 sudo[93292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:08 compute-0 python3.9[93294]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:08 compute-0 sudo[93292]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:09 compute-0 sudo[93444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrglzvpdrrwvqnahhxqbghrmlcftyznw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096549.123447-966-91410446133857/AnsiballZ_stat.py'
Nov 25 18:49:09 compute-0 sudo[93444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:09 compute-0 python3.9[93446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:09 compute-0 sudo[93444]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:10 compute-0 sudo[93567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzynyrsffovkyvjupsncvolmjwoomcay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096549.123447-966-91410446133857/AnsiballZ_copy.py'
Nov 25 18:49:10 compute-0 sudo[93567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:10 compute-0 python3.9[93569]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096549.123447-966-91410446133857/.source.json _original_basename=.3rxbodq3 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:49:10 compute-0 sudo[93567]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:10 compute-0 sudo[93719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwkzqprwxkdwephcscryyzirkgzummtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096550.6218858-996-130081313188425/AnsiballZ_file.py'
Nov 25 18:49:10 compute-0 sudo[93719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:11 compute-0 python3.9[93721]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:49:11 compute-0 sudo[93719]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:11 compute-0 sudo[93871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlxoqitxjnxdpmensncmhknleaayhhff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096551.4529037-1012-146251472742874/AnsiballZ_stat.py'
Nov 25 18:49:11 compute-0 sudo[93871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:12 compute-0 sudo[93871]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:12 compute-0 sudo[93994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdrrqrncspmqtuecgxlsvcneonwudzwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096551.4529037-1012-146251472742874/AnsiballZ_copy.py'
Nov 25 18:49:12 compute-0 sudo[93994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:12 compute-0 sudo[93994]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:13 compute-0 sudo[94146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfxrduhynjmcvkxpofnnvschcchyciie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096553.1221654-1046-203911684592881/AnsiballZ_container_config_data.py'
Nov 25 18:49:13 compute-0 sudo[94146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:13 compute-0 python3.9[94148]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 25 18:49:13 compute-0 sudo[94146]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:14 compute-0 sudo[94298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grdulovvbcwnqbzoyftqojlbqlfbhvir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096554.2527068-1064-192628219443574/AnsiballZ_container_config_hash.py'
Nov 25 18:49:14 compute-0 sudo[94298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:14 compute-0 python3.9[94300]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 18:49:14 compute-0 sudo[94298]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:15 compute-0 sudo[94450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paxyqcsxhcxfrhmhztqojuajkhzqzedt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096555.264641-1082-108045874073425/AnsiballZ_podman_container_info.py'
Nov 25 18:49:15 compute-0 sudo[94450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:15 compute-0 python3.9[94452]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 18:49:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:49:16 compute-0 sudo[94450]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:17 compute-0 sudo[94613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsdpbwnasrewynruhpnlmswipnotmjc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764096556.8103242-1108-77025626663144/AnsiballZ_edpm_container_manage.py'
Nov 25 18:49:17 compute-0 sudo[94613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:17 compute-0 python3[94615]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 18:49:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:49:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:49:17 compute-0 podman[94653]: 2025-11-25 18:49:17.994824179 +0000 UTC m=+0.072965762 container create 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 18:49:17 compute-0 podman[94653]: 2025-11-25 18:49:17.957353928 +0000 UTC m=+0.035495551 image pull 6a8194dc5cbc0b30fb087899a1cd17693ec8d18197c75e9f4cc0e4bdb35c6c1c 38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Nov 25 18:49:18 compute-0 python3[94615]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Nov 25 18:49:18 compute-0 sudo[94613]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 25 18:49:19 compute-0 sudo[94840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntzvglyyirftgrtufvazfynsegtutwjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096558.9878647-1124-183396292490162/AnsiballZ_stat.py'
Nov 25 18:49:19 compute-0 sudo[94840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:19 compute-0 python3.9[94842]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:49:19 compute-0 sudo[94840]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:20 compute-0 sudo[94995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egggoltohcjhuewnzeskvvhrswgvwkqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096559.954313-1142-44442813883387/AnsiballZ_file.py'
Nov 25 18:49:20 compute-0 sudo[94995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:20 compute-0 python3.9[94997]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:49:20 compute-0 sudo[94995]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:20 compute-0 sudo[95071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haqypppjysnynjizdeefbguhpjuzhwxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096559.954313-1142-44442813883387/AnsiballZ_stat.py'
Nov 25 18:49:20 compute-0 sudo[95071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:21 compute-0 python3.9[95073]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:49:21 compute-0 sudo[95071]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:21 compute-0 sudo[95222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqwiqhfzsrfuozstulemotgvonbonkyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096561.1375506-1142-79082847184982/AnsiballZ_copy.py'
Nov 25 18:49:21 compute-0 sudo[95222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:21 compute-0 python3.9[95224]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764096561.1375506-1142-79082847184982/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:49:21 compute-0 sudo[95222]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:22 compute-0 sudo[95298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdjmhmnxlrbycettcynhvpwrppiftpej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096561.1375506-1142-79082847184982/AnsiballZ_systemd.py'
Nov 25 18:49:22 compute-0 sudo[95298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:22 compute-0 python3.9[95300]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:49:22 compute-0 systemd[1]: Reloading.
Nov 25 18:49:22 compute-0 systemd-sysv-generator[95330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:49:22 compute-0 systemd-rc-local-generator[95326]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:49:22 compute-0 sudo[95298]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:23 compute-0 sudo[95408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ranjpknrugqkcvksuruollkwxfuakgnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096561.1375506-1142-79082847184982/AnsiballZ_systemd.py'
Nov 25 18:49:23 compute-0 sudo[95408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:23 compute-0 python3.9[95410]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:49:23 compute-0 systemd[1]: Reloading.
Nov 25 18:49:23 compute-0 systemd-rc-local-generator[95434]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:49:23 compute-0 systemd-sysv-generator[95438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:49:23 compute-0 systemd[1]: Starting ovn_controller container...
Nov 25 18:49:23 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 25 18:49:23 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:49:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96c020be0e4eb04273741451153c807968728a052587da8f861965d46577dee5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 18:49:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84.
Nov 25 18:49:23 compute-0 podman[95450]: 2025-11-25 18:49:23.939549576 +0000 UTC m=+0.175929611 container init 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 18:49:23 compute-0 ovn_controller[95465]: + sudo -E kolla_set_configs
Nov 25 18:49:23 compute-0 podman[95450]: 2025-11-25 18:49:23.980303442 +0000 UTC m=+0.216683457 container start 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 18:49:23 compute-0 edpm-start-podman-container[95450]: ovn_controller
Nov 25 18:49:24 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 25 18:49:24 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 25 18:49:24 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 25 18:49:24 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 25 18:49:24 compute-0 systemd[95499]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 25 18:49:24 compute-0 edpm-start-podman-container[95449]: Creating additional drop-in dependency for "ovn_controller" (8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84)
Nov 25 18:49:24 compute-0 podman[95472]: 2025-11-25 18:49:24.094318717 +0000 UTC m=+0.096290677 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 18:49:24 compute-0 systemd[1]: 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84-6695121f01902617.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 18:49:24 compute-0 systemd[1]: 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84-6695121f01902617.service: Failed with result 'exit-code'.
Nov 25 18:49:24 compute-0 systemd[1]: Reloading.
Nov 25 18:49:24 compute-0 systemd[95499]: Queued start job for default target Main User Target.
Nov 25 18:49:24 compute-0 systemd[95499]: Created slice User Application Slice.
Nov 25 18:49:24 compute-0 systemd[95499]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 25 18:49:24 compute-0 systemd[95499]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 18:49:24 compute-0 systemd[95499]: Reached target Paths.
Nov 25 18:49:24 compute-0 systemd[95499]: Reached target Timers.
Nov 25 18:49:24 compute-0 systemd[95499]: Starting D-Bus User Message Bus Socket...
Nov 25 18:49:24 compute-0 systemd[95499]: Starting Create User's Volatile Files and Directories...
Nov 25 18:49:24 compute-0 systemd[95499]: Finished Create User's Volatile Files and Directories.
Nov 25 18:49:24 compute-0 systemd[95499]: Listening on D-Bus User Message Bus Socket.
Nov 25 18:49:24 compute-0 systemd[95499]: Reached target Sockets.
Nov 25 18:49:24 compute-0 systemd[95499]: Reached target Basic System.
Nov 25 18:49:24 compute-0 systemd[95499]: Reached target Main User Target.
Nov 25 18:49:24 compute-0 systemd[95499]: Startup finished in 127ms.
Nov 25 18:49:24 compute-0 systemd-rc-local-generator[95552]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:49:24 compute-0 systemd-sysv-generator[95555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:49:24 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 25 18:49:24 compute-0 systemd[1]: Started ovn_controller container.
Nov 25 18:49:24 compute-0 systemd[1]: Started Session c1 of User root.
Nov 25 18:49:24 compute-0 sudo[95408]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:24 compute-0 ovn_controller[95465]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 18:49:24 compute-0 ovn_controller[95465]: INFO:__main__:Validating config file
Nov 25 18:49:24 compute-0 ovn_controller[95465]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 18:49:24 compute-0 ovn_controller[95465]: INFO:__main__:Writing out command to execute
Nov 25 18:49:24 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 25 18:49:24 compute-0 ovn_controller[95465]: ++ cat /run_command
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + ARGS=
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + sudo kolla_copy_cacerts
Nov 25 18:49:24 compute-0 systemd[1]: Started Session c2 of User root.
Nov 25 18:49:24 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + [[ ! -n '' ]]
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + . kolla_extend_start
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 25 18:49:24 compute-0 ovn_controller[95465]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + umask 0022
Nov 25 18:49:24 compute-0 ovn_controller[95465]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Nov 25 18:49:24 compute-0 ovn_controller[95465]: 2025-11-25T18:49:24Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Nov 25 18:49:24 compute-0 NetworkManager[55552]: <info>  [1764096564.5655] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 25 18:49:24 compute-0 NetworkManager[55552]: <info>  [1764096564.5666] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 25 18:49:24 compute-0 NetworkManager[55552]: <info>  [1764096564.5683] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 25 18:49:24 compute-0 NetworkManager[55552]: <info>  [1764096564.5691] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 25 18:49:24 compute-0 NetworkManager[55552]: <info>  [1764096564.5696] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 25 18:49:24 compute-0 kernel: br-int: entered promiscuous mode
Nov 25 18:49:24 compute-0 systemd-udevd[95601]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00025|main|INFO|OVS feature set changed, force recompute.
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00034|features|INFO|OVS Feature: group_support, state: supported
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00035|main|INFO|OVS feature set changed, force recompute.
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 25 18:49:25 compute-0 ovn_controller[95465]: 2025-11-25T18:49:25Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 25 18:49:25 compute-0 NetworkManager[55552]: <info>  [1764096565.7163] manager: (ovn-6b92b0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 25 18:49:25 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 25 18:49:25 compute-0 NetworkManager[55552]: <info>  [1764096565.7428] device (genev_sys_6081): carrier: link connected
Nov 25 18:49:25 compute-0 NetworkManager[55552]: <info>  [1764096565.7435] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 25 18:49:26 compute-0 sudo[95732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtyjvnunxqokbssdkqcumzalyknjuizp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096565.8009808-1198-163642961047601/AnsiballZ_command.py'
Nov 25 18:49:26 compute-0 sudo[95732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:26 compute-0 python3.9[95734]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:49:26 compute-0 ovs-vsctl[95735]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 25 18:49:26 compute-0 sudo[95732]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:26 compute-0 NetworkManager[55552]: <info>  [1764096566.8171] manager: (ovn-e4cabb-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 25 18:49:27 compute-0 sudo[95886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aguiabnzcsisxsydzsbpszuppeuqbenb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096566.7096896-1214-245652511646444/AnsiballZ_command.py'
Nov 25 18:49:27 compute-0 sudo[95886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:27 compute-0 python3.9[95888]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:49:27 compute-0 ovs-vsctl[95890]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 25 18:49:27 compute-0 sudo[95886]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:28 compute-0 sudo[96041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nemrxuqjakrmjwkcvnwvsfqbujantoaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096567.9404209-1242-109453612724245/AnsiballZ_command.py'
Nov 25 18:49:28 compute-0 sudo[96041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:28 compute-0 python3.9[96043]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:49:28 compute-0 ovs-vsctl[96044]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 25 18:49:28 compute-0 sudo[96041]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:28 compute-0 sshd-session[84970]: Connection closed by 192.168.122.30 port 60336
Nov 25 18:49:28 compute-0 sshd-session[84967]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:49:28 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Nov 25 18:49:28 compute-0 systemd[1]: session-20.scope: Consumed 55.590s CPU time.
Nov 25 18:49:28 compute-0 systemd-logind[820]: Session 20 logged out. Waiting for processes to exit.
Nov 25 18:49:29 compute-0 systemd-logind[820]: Removed session 20.
Nov 25 18:49:34 compute-0 sshd-session[96069]: Accepted publickey for zuul from 192.168.122.30 port 54550 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:49:34 compute-0 systemd-logind[820]: New session 22 of user zuul.
Nov 25 18:49:34 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 25 18:49:34 compute-0 sshd-session[96069]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:49:34 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 25 18:49:34 compute-0 systemd[95499]: Activating special unit Exit the Session...
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped target Main User Target.
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped target Basic System.
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped target Paths.
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped target Sockets.
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped target Timers.
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 18:49:34 compute-0 systemd[95499]: Closed D-Bus User Message Bus Socket.
Nov 25 18:49:34 compute-0 systemd[95499]: Stopped Create User's Volatile Files and Directories.
Nov 25 18:49:34 compute-0 systemd[95499]: Removed slice User Application Slice.
Nov 25 18:49:34 compute-0 systemd[95499]: Reached target Shutdown.
Nov 25 18:49:34 compute-0 systemd[95499]: Finished Exit the Session.
Nov 25 18:49:34 compute-0 systemd[95499]: Reached target Exit the Session.
Nov 25 18:49:34 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 25 18:49:34 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 25 18:49:34 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 25 18:49:34 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 25 18:49:34 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 25 18:49:34 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 25 18:49:34 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 25 18:49:34 compute-0 ovn_controller[95465]: 2025-11-25T18:49:34Z|00038|memory|INFO|15856 kB peak resident set size after 10.3 seconds
Nov 25 18:49:34 compute-0 ovn_controller[95465]: 2025-11-25T18:49:34Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Nov 25 18:49:35 compute-0 python3.9[96225]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:49:36 compute-0 sudo[96379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oggbfljlhwakymdyfzgdhkfpojubrics ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096576.0978174-48-25589709270838/AnsiballZ_file.py'
Nov 25 18:49:36 compute-0 sudo[96379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:36 compute-0 python3.9[96381]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:36 compute-0 sudo[96379]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:37 compute-0 sudo[96531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcbbcleckkrowsbxuchaueommuvntjdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096576.9673254-48-115837020507467/AnsiballZ_file.py'
Nov 25 18:49:37 compute-0 sudo[96531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:37 compute-0 python3.9[96533]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:37 compute-0 sudo[96531]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:37 compute-0 sudo[96683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bajyeuxzzdzeimrnrfxcgrspnyhkhzqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096577.6767108-48-148371466263046/AnsiballZ_file.py'
Nov 25 18:49:37 compute-0 sudo[96683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:38 compute-0 python3.9[96685]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:38 compute-0 sudo[96683]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:38 compute-0 sudo[96835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iojhkfyfwoylnalpbiuxahxhyxrqmwcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096578.392188-48-152943363064652/AnsiballZ_file.py'
Nov 25 18:49:38 compute-0 sudo[96835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:39 compute-0 python3.9[96837]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:39 compute-0 sudo[96835]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:39 compute-0 sudo[96987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbljsauikmrjyctffwqvpoatcxujylla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096579.1950526-48-43634920438167/AnsiballZ_file.py'
Nov 25 18:49:39 compute-0 sudo[96987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:39 compute-0 python3.9[96989]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:39 compute-0 sudo[96987]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:40 compute-0 python3.9[97139]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:49:41 compute-0 sudo[97289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbkqzdhjufxfjwabgxswlhuhsfwugryd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096581.1632214-136-68646418553463/AnsiballZ_seboolean.py'
Nov 25 18:49:41 compute-0 sudo[97289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:41 compute-0 python3.9[97291]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 25 18:49:42 compute-0 sudo[97289]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:43 compute-0 python3.9[97441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:44 compute-0 python3.9[97563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096582.7586107-152-60005395127775/.source follow=False _original_basename=haproxy.j2 checksum=5da0851b5d6d8c67f8f439d0f4fb0625e087d380 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:44 compute-0 python3.9[97713]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:45 compute-0 python3.9[97834]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096584.4697535-182-272704977034365/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:46 compute-0 sudo[97984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtvjjfncukhkqyfyokwglpknphdiddcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096586.018468-216-83505101773457/AnsiballZ_setup.py'
Nov 25 18:49:46 compute-0 sudo[97984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:46 compute-0 python3.9[97986]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:49:47 compute-0 sudo[97984]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:47 compute-0 sudo[98068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biytnhorcectkzlglrgmbbuqppzvdhvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096586.018468-216-83505101773457/AnsiballZ_dnf.py'
Nov 25 18:49:47 compute-0 sudo[98068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:47 compute-0 python3.9[98070]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:49:49 compute-0 sudo[98068]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:49 compute-0 sudo[98221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgxquvwvvyisuhrufhcbqfmahesmbsja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096589.302687-240-85286047048189/AnsiballZ_systemd.py'
Nov 25 18:49:49 compute-0 sudo[98221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:50 compute-0 python3.9[98223]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:49:50 compute-0 sudo[98221]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:51 compute-0 python3.9[98376]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:51 compute-0 python3.9[98497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096590.6428285-256-137321769950818/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:52 compute-0 python3.9[98647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:53 compute-0 python3.9[98768]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096591.825693-256-23994960272310/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:54 compute-0 podman[98892]: 2025-11-25 18:49:54.293783187 +0000 UTC m=+0.107419079 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 18:49:54 compute-0 python3.9[98933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:55 compute-0 python3.9[99065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096593.914605-344-177229533087609/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:55 compute-0 python3.9[99215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:56 compute-0 python3.9[99336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096595.196995-344-247467300161468/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:57 compute-0 python3.9[99487]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:49:58 compute-0 sudo[99639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ailrskzkimewygomdxcwzsevdwpsseuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096598.0094035-420-133325958314303/AnsiballZ_file.py'
Nov 25 18:49:58 compute-0 sudo[99639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:58 compute-0 python3.9[99641]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:58 compute-0 sudo[99639]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:59 compute-0 sudo[99791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsmwbraymgdvypcugqxbuzefwnlnjpmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096598.81005-436-82674373164356/AnsiballZ_stat.py'
Nov 25 18:49:59 compute-0 sudo[99791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:59 compute-0 python3.9[99793]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:49:59 compute-0 sudo[99791]: pam_unix(sudo:session): session closed for user root
Nov 25 18:49:59 compute-0 sudo[99869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvsebgvfxuumeoxdggmzcgvxfwzhajrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096598.81005-436-82674373164356/AnsiballZ_file.py'
Nov 25 18:49:59 compute-0 sudo[99869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:49:59 compute-0 python3.9[99871]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:49:59 compute-0 sudo[99869]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:00 compute-0 sudo[100021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpubxykripjdpeblhtffxeaewtebtojj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096600.0564375-436-149556585040086/AnsiballZ_stat.py'
Nov 25 18:50:00 compute-0 sudo[100021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:00 compute-0 python3.9[100023]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:00 compute-0 sudo[100021]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:00 compute-0 sudo[100099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-houpxfxnhxejpvbjjptvaylscyfgkxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096600.0564375-436-149556585040086/AnsiballZ_file.py'
Nov 25 18:50:00 compute-0 sudo[100099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:01 compute-0 python3.9[100101]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:50:01 compute-0 sudo[100099]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:01 compute-0 sudo[100251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftvpduwlzdlljrjqjpyzqkmnqcxpkfba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096601.497617-482-162649610130378/AnsiballZ_file.py'
Nov 25 18:50:01 compute-0 sudo[100251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:02 compute-0 python3.9[100253]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:02 compute-0 sudo[100251]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:02 compute-0 sudo[100403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qarmxjaczorhtckstferbvdeeantrcfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096602.2991529-498-141622205506469/AnsiballZ_stat.py'
Nov 25 18:50:02 compute-0 sudo[100403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:02 compute-0 python3.9[100405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:02 compute-0 sudo[100403]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:03 compute-0 sudo[100481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwfcusohvrihlievejpkxifmiyanfhso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096602.2991529-498-141622205506469/AnsiballZ_file.py'
Nov 25 18:50:03 compute-0 sudo[100481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:03 compute-0 python3.9[100483]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:03 compute-0 sudo[100481]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:04 compute-0 sudo[100633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzzfwuwmhijjzzeokpmrtfqngvlddowh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096603.8054233-522-277536282924044/AnsiballZ_stat.py'
Nov 25 18:50:04 compute-0 sudo[100633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:04 compute-0 python3.9[100635]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:04 compute-0 sudo[100633]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:04 compute-0 sudo[100711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhucboppednwvalepdzzggqueevsxwky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096603.8054233-522-277536282924044/AnsiballZ_file.py'
Nov 25 18:50:04 compute-0 sudo[100711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:04 compute-0 python3.9[100713]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:04 compute-0 sudo[100711]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:05 compute-0 sudo[100863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eckrhuohgdivpixkgypcmjcdxknzgwbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096605.2019782-546-60727291972964/AnsiballZ_systemd.py'
Nov 25 18:50:05 compute-0 sudo[100863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:05 compute-0 python3.9[100865]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:05 compute-0 systemd[1]: Reloading.
Nov 25 18:50:06 compute-0 systemd-sysv-generator[100895]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:50:06 compute-0 systemd-rc-local-generator[100890]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:50:06 compute-0 sudo[100863]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:06 compute-0 sudo[101053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwqxuujoqzemlbawezkfyypyvkgjuewm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096606.554965-562-12600892683823/AnsiballZ_stat.py'
Nov 25 18:50:06 compute-0 sudo[101053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:07 compute-0 python3.9[101055]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:07 compute-0 sudo[101053]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:07 compute-0 sudo[101131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxijmlhttskacgseeczhuwbvszldyfsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096606.554965-562-12600892683823/AnsiballZ_file.py'
Nov 25 18:50:07 compute-0 sudo[101131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:07 compute-0 python3.9[101133]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:07 compute-0 sudo[101131]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:08 compute-0 sudo[101283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enbyyasjzylvqhncvdxkiywzqsyqvfrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096608.1108015-586-161830399790205/AnsiballZ_stat.py'
Nov 25 18:50:08 compute-0 sudo[101283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:08 compute-0 python3.9[101285]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:08 compute-0 sudo[101283]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:09 compute-0 sudo[101361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozivjeuojfjonqtuxelzxyzsfxdyfkrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096608.1108015-586-161830399790205/AnsiballZ_file.py'
Nov 25 18:50:09 compute-0 sudo[101361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:09 compute-0 python3.9[101363]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:09 compute-0 sudo[101361]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:09 compute-0 sudo[101513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woquwcrjjhvubndncmdrxcbkixzpwttd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096609.5460308-610-20324721452126/AnsiballZ_systemd.py'
Nov 25 18:50:09 compute-0 sudo[101513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:10 compute-0 python3.9[101515]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:10 compute-0 systemd[1]: Reloading.
Nov 25 18:50:10 compute-0 systemd-sysv-generator[101545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:50:10 compute-0 systemd-rc-local-generator[101538]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:50:10 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 18:50:10 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 18:50:10 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 18:50:10 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 18:50:10 compute-0 sudo[101513]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:11 compute-0 sudo[101705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlcnkpurujflbwmihyjqvffrywfuvlog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096610.998444-630-18344282204743/AnsiballZ_file.py'
Nov 25 18:50:11 compute-0 sudo[101705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:11 compute-0 python3.9[101707]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:50:11 compute-0 sudo[101705]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:12 compute-0 sudo[101857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agusfayrkcnhsowmpkqusebfgepectst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096611.815093-646-278742881249657/AnsiballZ_stat.py'
Nov 25 18:50:12 compute-0 sudo[101857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:12 compute-0 python3.9[101859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:12 compute-0 sudo[101857]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:12 compute-0 sudo[101980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzbncxpfmxavcsgcpuuxbcpmimnehewt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096611.815093-646-278742881249657/AnsiballZ_copy.py'
Nov 25 18:50:12 compute-0 sudo[101980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:13 compute-0 python3.9[101982]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764096611.815093-646-278742881249657/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:50:13 compute-0 sudo[101980]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:13 compute-0 sudo[102132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhdgqqodizlxxleiafhwgklqpbbomirr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096613.569557-680-188798477639103/AnsiballZ_file.py'
Nov 25 18:50:13 compute-0 sudo[102132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:14 compute-0 python3.9[102134]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:50:14 compute-0 sudo[102132]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:14 compute-0 sudo[102284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpuotckwqwvsfanoadzpwpewbehwzsmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096614.4661214-696-120847585382275/AnsiballZ_stat.py'
Nov 25 18:50:14 compute-0 sudo[102284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:15 compute-0 python3.9[102286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:50:15 compute-0 sudo[102284]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:15 compute-0 sudo[102407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfgxgcxpjkjepdnltrynhjchvmlmnkig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096614.4661214-696-120847585382275/AnsiballZ_copy.py'
Nov 25 18:50:15 compute-0 sudo[102407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:15 compute-0 python3.9[102409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096614.4661214-696-120847585382275/.source.json _original_basename=.n7myhba5 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:15 compute-0 sudo[102407]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:16 compute-0 sudo[102559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzfurzkzyymtvsyrljnipaloflnxraou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096615.9461224-726-273557590968118/AnsiballZ_file.py'
Nov 25 18:50:16 compute-0 sudo[102559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:16 compute-0 python3.9[102561]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:16 compute-0 sudo[102559]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:17 compute-0 sudo[102711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiwywpfpyqnarpuyxvaqlvrlbuzgzfiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096616.7869925-742-88131463180844/AnsiballZ_stat.py'
Nov 25 18:50:17 compute-0 sudo[102711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:17 compute-0 sudo[102711]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:17 compute-0 sudo[102834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjwkqzyzrlfufbtcihzyctdljxvkzzrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096616.7869925-742-88131463180844/AnsiballZ_copy.py'
Nov 25 18:50:17 compute-0 sudo[102834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:17 compute-0 sudo[102834]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:19 compute-0 sudo[102986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abmeozhwuncxvyiklqdksmikhyuhrnoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096618.7064142-776-230280843823216/AnsiballZ_container_config_data.py'
Nov 25 18:50:19 compute-0 sudo[102986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:19 compute-0 python3.9[102988]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 25 18:50:19 compute-0 sudo[102986]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:20 compute-0 sudo[103138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjgyecpkqodihzetvxlnojkyyjgenagu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096619.6880317-794-224082306194490/AnsiballZ_container_config_hash.py'
Nov 25 18:50:20 compute-0 sudo[103138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:20 compute-0 python3.9[103140]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 18:50:20 compute-0 sudo[103138]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:21 compute-0 sudo[103290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlnunbsvuftuufskhehftwqwvrbxczkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096620.7637112-812-79177035745422/AnsiballZ_podman_container_info.py'
Nov 25 18:50:21 compute-0 sudo[103290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:21 compute-0 python3.9[103292]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 18:50:21 compute-0 sudo[103290]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:22 compute-0 sudo[103468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gffxvcijostzhqhrhbsowqxnudhuobqf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764096622.3508384-838-44520611260551/AnsiballZ_edpm_container_manage.py'
Nov 25 18:50:22 compute-0 sudo[103468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:23 compute-0 python3[103470]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 18:50:23 compute-0 podman[103508]: 2025-11-25 18:50:23.441593857 +0000 UTC m=+0.024237271 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 18:50:23 compute-0 podman[103508]: 2025-11-25 18:50:23.674348717 +0000 UTC m=+0.256992081 container create 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 18:50:23 compute-0 python3[103470]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 18:50:23 compute-0 sudo[103468]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:24 compute-0 sudo[103708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvsranxqqvhzhzgtsqhimdgrsdlmiwbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096624.0970778-854-135417518920276/AnsiballZ_stat.py'
Nov 25 18:50:24 compute-0 sudo[103708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:24 compute-0 podman[103670]: 2025-11-25 18:50:24.51355637 +0000 UTC m=+0.127584755 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 18:50:24 compute-0 python3.9[103719]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:50:24 compute-0 sudo[103708]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:25 compute-0 sudo[103878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loemizpdkkzsjnjigksmqnkvugolgjpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096625.103275-872-188461723334177/AnsiballZ_file.py'
Nov 25 18:50:25 compute-0 sudo[103878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:25 compute-0 python3.9[103880]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:25 compute-0 sudo[103878]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:26 compute-0 sudo[103954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbywmqzawvdsbiakukjcqdmvlpmwxfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096625.103275-872-188461723334177/AnsiballZ_stat.py'
Nov 25 18:50:26 compute-0 sudo[103954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:26 compute-0 python3.9[103956]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:50:26 compute-0 sudo[103954]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:26 compute-0 sudo[104105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uswwxprdehcrdhgiponjbokekgbmafbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096626.3175898-872-24673514587386/AnsiballZ_copy.py'
Nov 25 18:50:26 compute-0 sudo[104105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:27 compute-0 python3.9[104107]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764096626.3175898-872-24673514587386/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:27 compute-0 sudo[104105]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:27 compute-0 sudo[104181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocrkuzibunwxzuxmwydcwmlrfadeowex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096626.3175898-872-24673514587386/AnsiballZ_systemd.py'
Nov 25 18:50:27 compute-0 sudo[104181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:27 compute-0 python3.9[104183]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:50:27 compute-0 systemd[1]: Reloading.
Nov 25 18:50:27 compute-0 systemd-rc-local-generator[104213]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:50:27 compute-0 systemd-sysv-generator[104216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:50:27 compute-0 sudo[104181]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:28 compute-0 sudo[104294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwpwpsredjtleklfuzrfnjlztvaaxjte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096626.3175898-872-24673514587386/AnsiballZ_systemd.py'
Nov 25 18:50:28 compute-0 sudo[104294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:28 compute-0 python3.9[104296]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:28 compute-0 systemd[1]: Reloading.
Nov 25 18:50:28 compute-0 systemd-rc-local-generator[104326]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:50:28 compute-0 systemd-sysv-generator[104330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:50:28 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 25 18:50:29 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:50:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a46bdea3b7585d5739fc9341629684ab7280a5ec7ae2afb6ce3e47bd2944ebe/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 25 18:50:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a46bdea3b7585d5739fc9341629684ab7280a5ec7ae2afb6ce3e47bd2944ebe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 18:50:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec.
Nov 25 18:50:29 compute-0 podman[104336]: 2025-11-25 18:50:29.100256099 +0000 UTC m=+0.160900716 container init 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + sudo -E kolla_set_configs
Nov 25 18:50:29 compute-0 podman[104336]: 2025-11-25 18:50:29.144788249 +0000 UTC m=+0.205432846 container start 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 18:50:29 compute-0 edpm-start-podman-container[104336]: ovn_metadata_agent
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Validating config file
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Copying service configuration files
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Writing out command to execute
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: ++ cat /run_command
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + CMD=neutron-ovn-metadata-agent
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + ARGS=
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + sudo kolla_copy_cacerts
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + [[ ! -n '' ]]
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + . kolla_extend_start
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: Running command: 'neutron-ovn-metadata-agent'
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + umask 0022
Nov 25 18:50:29 compute-0 ovn_metadata_agent[104351]: + exec neutron-ovn-metadata-agent
Nov 25 18:50:29 compute-0 podman[104358]: 2025-11-25 18:50:29.246125449 +0000 UTC m=+0.082182352 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 25 18:50:29 compute-0 edpm-start-podman-container[104335]: Creating additional drop-in dependency for "ovn_metadata_agent" (954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec)
Nov 25 18:50:29 compute-0 systemd[1]: Reloading.
Nov 25 18:50:29 compute-0 systemd-sysv-generator[104434]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:50:29 compute-0 systemd-rc-local-generator[104431]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:50:29 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 25 18:50:29 compute-0 sudo[104294]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:30 compute-0 sshd-session[96072]: Connection closed by 192.168.122.30 port 54550
Nov 25 18:50:30 compute-0 sshd-session[96069]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:50:30 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 25 18:50:30 compute-0 systemd[1]: session-22.scope: Consumed 39.873s CPU time.
Nov 25 18:50:30 compute-0 systemd-logind[820]: Session 22 logged out. Waiting for processes to exit.
Nov 25 18:50:30 compute-0 systemd-logind[820]: Removed session 22.
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.992 104356 INFO neutron.common.config [-] Logging enabled!
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.992 104356 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.993 104356 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.993 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.993 104356 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.993 104356 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.994 104356 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.995 104356 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.996 104356 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.997 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.998 104356 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.177 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:30.999 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.000 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.001 104356 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.002 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.003 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.004 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.005 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.006 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.007 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.008 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.009 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.010 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.011 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.012 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.013 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.014 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.015 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.016 104356 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.017 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.018 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.019 104356 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.020 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.021 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.022 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.023 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.023 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.023 104356 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.023 104356 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.031 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.031 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.031 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.031 104356 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.032 104356 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.041 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 942ca545-427a-4223-ba58-570f588d0469 (UUID: 942ca545-427a-4223-ba58-570f588d0469) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.070 104356 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.070 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.070 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.070 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.070 104356 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.073 104356 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.078 104356 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.085 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '942ca545-427a-4223-ba58-570f588d0469'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], external_ids={}, name=942ca545-427a-4223-ba58-570f588d0469, nb_cfg_timestamp=1764096573597, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.087 104356 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpj4jxwzwu/privsep.sock']
Nov 25 18:50:31 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.845 104356 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.846 104356 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpj4jxwzwu/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.701 104475 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.705 104475 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.707 104475 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.707 104475 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104475
Nov 25 18:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:31.848 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[2a397650-cf58-4efe-b344-2cc0d9d0ff13]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.259 104475 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.259 104475 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.259 104475 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.702 104475 INFO oslo_service.backend [-] Loading backend: eventlet
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.707 104475 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.743 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[049f5921-4b8d-435f-956a-02e11fe9d346]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.744 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, column=external_ids, values=({'neutron:ovn-metadata-id': '339d3e93-7dbc-5f4e-87df-deaf74a1440d'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.804 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 18:50:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:50:32.811 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 18:50:35 compute-0 sshd-session[104480]: Accepted publickey for zuul from 192.168.122.30 port 37754 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:50:35 compute-0 systemd-logind[820]: New session 23 of user zuul.
Nov 25 18:50:35 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 25 18:50:35 compute-0 sshd-session[104480]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:50:37 compute-0 python3.9[104633]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:50:38 compute-0 sudo[104787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-talukkikojvvzxiwkufoktwbbckjyuxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096637.8349044-48-241602925879150/AnsiballZ_command.py'
Nov 25 18:50:38 compute-0 sudo[104787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:38 compute-0 python3.9[104789]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:50:38 compute-0 sudo[104787]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:39 compute-0 sudo[104952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlriiocvejbdnkopzqsjafbtvmvenbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096639.1849957-70-48855520887354/AnsiballZ_systemd_service.py'
Nov 25 18:50:39 compute-0 sudo[104952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:40 compute-0 python3.9[104954]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:50:40 compute-0 systemd[1]: Reloading.
Nov 25 18:50:40 compute-0 systemd-rc-local-generator[104983]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:50:40 compute-0 systemd-sysv-generator[104987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:50:40 compute-0 sudo[104952]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:41 compute-0 python3.9[105140]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:50:41 compute-0 network[105157]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:50:41 compute-0 network[105158]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:50:41 compute-0 network[105159]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:50:46 compute-0 sudo[105418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvynvyvobunvkknfbhxbtrpfhsvffjca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096645.7423081-108-190443528253488/AnsiballZ_systemd_service.py'
Nov 25 18:50:46 compute-0 sudo[105418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:46 compute-0 python3.9[105420]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:46 compute-0 sudo[105418]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:47 compute-0 sudo[105571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rosnejwpafxpjlzzwwygccssjujehgyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096646.7629411-108-273110972658099/AnsiballZ_systemd_service.py'
Nov 25 18:50:47 compute-0 sudo[105571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:47 compute-0 python3.9[105573]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:47 compute-0 sudo[105571]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:48 compute-0 sudo[105724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlvodbvudlapluywqrgtgolhamvkigmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096647.7633562-108-249159663355591/AnsiballZ_systemd_service.py'
Nov 25 18:50:48 compute-0 sudo[105724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:48 compute-0 python3.9[105726]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:48 compute-0 sudo[105724]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:49 compute-0 sudo[105877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtvuimsjfnouwrpvjkxazsbhvepofzgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096648.656279-108-180861975404236/AnsiballZ_systemd_service.py'
Nov 25 18:50:49 compute-0 sudo[105877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:49 compute-0 python3.9[105879]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:49 compute-0 sudo[105877]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:50 compute-0 sudo[106030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpwkvctztsgtlncvrekxwfxiggyontvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096649.9732141-108-172276007760784/AnsiballZ_systemd_service.py'
Nov 25 18:50:50 compute-0 sudo[106030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:50 compute-0 python3.9[106032]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:50 compute-0 sudo[106030]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:51 compute-0 sudo[106183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rklqyybhzsfytspewhdjawbnonwtdcwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096650.891565-108-147861104501554/AnsiballZ_systemd_service.py'
Nov 25 18:50:51 compute-0 sudo[106183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:51 compute-0 python3.9[106185]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:52 compute-0 sudo[106183]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:53 compute-0 sudo[106336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkfbbrqfxrmbfynptkrteyyajkwehagm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096652.8260398-108-66770650977398/AnsiballZ_systemd_service.py'
Nov 25 18:50:53 compute-0 sudo[106336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:53 compute-0 python3.9[106338]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:50:53 compute-0 sudo[106336]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:54 compute-0 sudo[106489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsfopocrtedthlhxhxqlozwnzxzdapts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096654.026053-212-270034756902917/AnsiballZ_file.py'
Nov 25 18:50:54 compute-0 sudo[106489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:54 compute-0 podman[106491]: 2025-11-25 18:50:54.75540237 +0000 UTC m=+0.147617629 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:50:54 compute-0 python3.9[106492]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:54 compute-0 sudo[106489]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:55 compute-0 sudo[106668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvetfzlmmkmciipfktsecjsbitssuylo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096655.0203323-212-258229758884717/AnsiballZ_file.py'
Nov 25 18:50:55 compute-0 sudo[106668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:55 compute-0 python3.9[106670]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:55 compute-0 sudo[106668]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:56 compute-0 sudo[106820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmxyymfzawlnqzvqcghuoertdpbluhce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096655.8259916-212-137596305227419/AnsiballZ_file.py'
Nov 25 18:50:56 compute-0 sudo[106820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:56 compute-0 python3.9[106822]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:56 compute-0 sudo[106820]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:56 compute-0 sudo[106972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavqxmlffsehvndmndidposlrphemouf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096656.5784266-212-160642891899807/AnsiballZ_file.py'
Nov 25 18:50:56 compute-0 sudo[106972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:57 compute-0 python3.9[106974]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:57 compute-0 sudo[106972]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:57 compute-0 sudo[107125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwewxvqyvydbcqquptehxmdvdvnswfao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096657.3851302-212-149898785332198/AnsiballZ_file.py'
Nov 25 18:50:57 compute-0 sudo[107125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:58 compute-0 python3.9[107127]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:58 compute-0 sudo[107125]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:58 compute-0 sudo[107277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjssglwbhjranryivccebwuvsikkhtlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096658.2064793-212-164357849115410/AnsiballZ_file.py'
Nov 25 18:50:58 compute-0 sudo[107277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:58 compute-0 python3.9[107279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:58 compute-0 sudo[107277]: pam_unix(sudo:session): session closed for user root
Nov 25 18:50:59 compute-0 sudo[107438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-govidynmgxpvmzkigovvkwcnamrulqez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096658.992252-212-276527890406522/AnsiballZ_file.py'
Nov 25 18:50:59 compute-0 sudo[107438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:50:59 compute-0 podman[107403]: 2025-11-25 18:50:59.389779477 +0000 UTC m=+0.081219839 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 18:50:59 compute-0 python3.9[107450]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:50:59 compute-0 sudo[107438]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:00 compute-0 sudo[107600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwggpluwvzzaszwojzptusfwjpeehyvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096659.9809945-312-236509928887935/AnsiballZ_file.py'
Nov 25 18:51:00 compute-0 sudo[107600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:00 compute-0 python3.9[107602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:00 compute-0 sudo[107600]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:01 compute-0 sudo[107752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjvkvbaxscmqenbzvwfznzzpmedbfjlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096660.772939-312-159989557954779/AnsiballZ_file.py'
Nov 25 18:51:01 compute-0 sudo[107752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:01 compute-0 python3.9[107754]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:01 compute-0 sudo[107752]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:01 compute-0 sudo[107904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbctdgvxjylmkvfjjbhozttbbmbaltaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096661.5617814-312-113372768496048/AnsiballZ_file.py'
Nov 25 18:51:01 compute-0 sudo[107904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:02 compute-0 python3.9[107906]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:02 compute-0 sudo[107904]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:02 compute-0 sudo[108057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyitvqzyvhyituvqqjvgditzjlusehvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096662.4119573-312-96446262315069/AnsiballZ_file.py'
Nov 25 18:51:02 compute-0 sudo[108057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:02 compute-0 python3.9[108059]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:02 compute-0 sudo[108057]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:03 compute-0 sudo[108209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnzoyoyeeumxmuhizzfzxktmrvxocykm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096663.1513686-312-100173079255341/AnsiballZ_file.py'
Nov 25 18:51:03 compute-0 sudo[108209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:03 compute-0 python3.9[108211]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:03 compute-0 sudo[108209]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:04 compute-0 sudo[108361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhgayqpcrzgyjpsxxprubbymuvqjwnjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096663.9045115-312-272608173287587/AnsiballZ_file.py'
Nov 25 18:51:04 compute-0 sudo[108361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:04 compute-0 python3.9[108363]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:04 compute-0 sudo[108361]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:05 compute-0 sudo[108513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkfnereqctbjurjgpreqefnezxtirkqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096664.6398253-312-236920572366829/AnsiballZ_file.py'
Nov 25 18:51:05 compute-0 sudo[108513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:05 compute-0 python3.9[108515]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:51:05 compute-0 sudo[108513]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:05 compute-0 sudo[108665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlovetrlolzrivmjtpfsufpvwqodpain ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096665.5685039-414-159532133301540/AnsiballZ_command.py'
Nov 25 18:51:05 compute-0 sudo[108665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:06 compute-0 python3.9[108667]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:06 compute-0 sudo[108665]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:07 compute-0 python3.9[108819]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 18:51:08 compute-0 sudo[108969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wivuwcfvjaebpcfipedoqpzkeqdmsnik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096667.6752393-450-170346570823796/AnsiballZ_systemd_service.py'
Nov 25 18:51:08 compute-0 sudo[108969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:08 compute-0 python3.9[108971]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:51:08 compute-0 systemd[1]: Reloading.
Nov 25 18:51:08 compute-0 systemd-rc-local-generator[109000]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:51:08 compute-0 systemd-sysv-generator[109003]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:51:08 compute-0 sudo[108969]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:09 compute-0 sudo[109157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enkugzqgdhbqwqngngupaamcueayropx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096669.008017-466-125314556423401/AnsiballZ_command.py'
Nov 25 18:51:09 compute-0 sudo[109157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:09 compute-0 python3.9[109159]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:09 compute-0 sudo[109157]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:10 compute-0 sudo[109310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jywllfubtftoeawyzmkvdnvleolidlkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096669.882699-466-9078903197784/AnsiballZ_command.py'
Nov 25 18:51:10 compute-0 sudo[109310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:10 compute-0 python3.9[109312]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:10 compute-0 sudo[109310]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:10 compute-0 sudo[109463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjzxvwbtdcdwmuedzujjmzqybsjteiwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096670.6032188-466-188708176507332/AnsiballZ_command.py'
Nov 25 18:51:10 compute-0 sudo[109463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:11 compute-0 python3.9[109465]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:11 compute-0 sudo[109463]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:11 compute-0 sudo[109616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukdhesxwcjzofucodpkdalerzfwybkqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096671.3758647-466-9458389365882/AnsiballZ_command.py'
Nov 25 18:51:11 compute-0 sudo[109616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:11 compute-0 python3.9[109618]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:11 compute-0 sudo[109616]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:12 compute-0 sudo[109769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luwpebktdywlypqpgktyjkouixzpzhft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096672.1142023-466-44137852263862/AnsiballZ_command.py'
Nov 25 18:51:12 compute-0 sudo[109769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:12 compute-0 python3.9[109771]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:12 compute-0 sudo[109769]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:13 compute-0 sudo[109922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgkkmvxagildydqgpntqqsckcpgnyxuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096672.8366675-466-223904587700225/AnsiballZ_command.py'
Nov 25 18:51:13 compute-0 sudo[109922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:13 compute-0 python3.9[109924]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:13 compute-0 sudo[109922]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:14 compute-0 sudo[110075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqtwjzipxbwexgsencapktymiaqmpjfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096673.8882637-466-42888007006859/AnsiballZ_command.py'
Nov 25 18:51:14 compute-0 sudo[110075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:14 compute-0 python3.9[110077]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:51:14 compute-0 sudo[110075]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:15 compute-0 sudo[110228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niqsdymowreihzluyhhhfcvqiskyilsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096674.978583-574-134718601046281/AnsiballZ_getent.py'
Nov 25 18:51:15 compute-0 sudo[110228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:15 compute-0 python3.9[110230]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 25 18:51:15 compute-0 sudo[110228]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:16 compute-0 sudo[110381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leaotmaacusghacotmkfbngekmifnowo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096675.9177356-590-153499452993860/AnsiballZ_group.py'
Nov 25 18:51:16 compute-0 sudo[110381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:16 compute-0 python3.9[110383]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 18:51:16 compute-0 groupadd[110384]: group added to /etc/group: name=libvirt, GID=42473
Nov 25 18:51:16 compute-0 groupadd[110384]: group added to /etc/gshadow: name=libvirt
Nov 25 18:51:16 compute-0 groupadd[110384]: new group: name=libvirt, GID=42473
Nov 25 18:51:16 compute-0 sudo[110381]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:17 compute-0 sudo[110539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grtwufdnajcqewdkgsryzbzdqydwawla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096677.0710125-606-275426680533150/AnsiballZ_user.py'
Nov 25 18:51:17 compute-0 sudo[110539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:17 compute-0 python3.9[110541]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 18:51:18 compute-0 useradd[110543]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 18:51:18 compute-0 sudo[110539]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:19 compute-0 sudo[110699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lclsypmtbnvtmzoxncoqebitfytlkunk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096678.6887403-628-251924124726716/AnsiballZ_setup.py'
Nov 25 18:51:19 compute-0 sudo[110699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:19 compute-0 python3.9[110701]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:51:19 compute-0 sudo[110699]: pam_unix(sudo:session): session closed for user root
Nov 25 18:51:20 compute-0 sudo[110783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytxicyhlywndwdhrrtpfxmdnkxzfmehx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096678.6887403-628-251924124726716/AnsiballZ_dnf.py'
Nov 25 18:51:20 compute-0 sudo[110783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:51:20 compute-0 python3.9[110785]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:51:25 compute-0 podman[110796]: 2025-11-25 18:51:25.252938385 +0000 UTC m=+0.162651194 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Nov 25 18:51:30 compute-0 podman[110824]: 2025-11-25 18:51:30.164931348 +0000 UTC m=+0.086670529 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:51:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:51:31.024 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:51:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:51:31.025 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:51:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:51:31.025 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:51:52 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:51:52 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:51:56 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 25 18:51:56 compute-0 podman[111033]: 2025-11-25 18:51:56.255042467 +0000 UTC m=+0.166987312 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 18:52:01 compute-0 podman[111061]: 2025-11-25 18:52:01.211895628 +0000 UTC m=+0.122356316 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Nov 25 18:52:01 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:52:01 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:52:27 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 25 18:52:27 compute-0 podman[117781]: 2025-11-25 18:52:27.210685454 +0000 UTC m=+0.107499110 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 18:52:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:52:31.026 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:52:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:52:31.026 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:52:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:52:31.026 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:52:32 compute-0 podman[120048]: 2025-11-25 18:52:32.165581162 +0000 UTC m=+0.085360007 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 18:52:58 compute-0 podman[127937]: 2025-11-25 18:52:58.229494764 +0000 UTC m=+0.132487361 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 18:53:00 compute-0 sshd-session[127964]: Received disconnect from 150.95.85.24 port 60594:11:  [preauth]
Nov 25 18:53:00 compute-0 sshd-session[127964]: Disconnected from authenticating user root 150.95.85.24 port 60594 [preauth]
Nov 25 18:53:03 compute-0 podman[127970]: 2025-11-25 18:53:03.168354022 +0000 UTC m=+0.080785476 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 18:53:03 compute-0 kernel: SELinux:  Converting 2760 SID table entries...
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 25 18:53:03 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 25 18:53:05 compute-0 groupadd[127998]: group added to /etc/group: name=dnsmasq, GID=992
Nov 25 18:53:05 compute-0 groupadd[127998]: group added to /etc/gshadow: name=dnsmasq
Nov 25 18:53:05 compute-0 groupadd[127998]: new group: name=dnsmasq, GID=992
Nov 25 18:53:05 compute-0 useradd[128005]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 25 18:53:05 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:53:05 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 25 18:53:05 compute-0 dbus-broker-launch[791]: Noticed file-system modification, trigger reload.
Nov 25 18:53:06 compute-0 groupadd[128018]: group added to /etc/group: name=clevis, GID=991
Nov 25 18:53:06 compute-0 groupadd[128018]: group added to /etc/gshadow: name=clevis
Nov 25 18:53:06 compute-0 groupadd[128018]: new group: name=clevis, GID=991
Nov 25 18:53:06 compute-0 useradd[128025]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 25 18:53:06 compute-0 usermod[128035]: add 'clevis' to group 'tss'
Nov 25 18:53:06 compute-0 usermod[128035]: add 'clevis' to shadow group 'tss'
Nov 25 18:53:10 compute-0 polkitd[43621]: Reloading rules
Nov 25 18:53:10 compute-0 polkitd[43621]: Collecting garbage unconditionally...
Nov 25 18:53:10 compute-0 polkitd[43621]: Loading rules from directory /etc/polkit-1/rules.d
Nov 25 18:53:10 compute-0 polkitd[43621]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 25 18:53:10 compute-0 polkitd[43621]: Finished loading, compiling and executing 3 rules
Nov 25 18:53:10 compute-0 polkitd[43621]: Reloading rules
Nov 25 18:53:10 compute-0 polkitd[43621]: Collecting garbage unconditionally...
Nov 25 18:53:10 compute-0 polkitd[43621]: Loading rules from directory /etc/polkit-1/rules.d
Nov 25 18:53:10 compute-0 polkitd[43621]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 25 18:53:10 compute-0 polkitd[43621]: Finished loading, compiling and executing 3 rules
Nov 25 18:53:11 compute-0 groupadd[128222]: group added to /etc/group: name=ceph, GID=167
Nov 25 18:53:11 compute-0 groupadd[128222]: group added to /etc/gshadow: name=ceph
Nov 25 18:53:11 compute-0 groupadd[128222]: new group: name=ceph, GID=167
Nov 25 18:53:11 compute-0 useradd[128228]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 25 18:53:14 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 25 18:53:14 compute-0 sshd[1009]: Received signal 15; terminating.
Nov 25 18:53:14 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 25 18:53:14 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 25 18:53:14 compute-0 systemd[1]: sshd.service: Consumed 1.900s CPU time, read 32.0K from disk, written 0B to disk.
Nov 25 18:53:14 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 25 18:53:14 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 25 18:53:14 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 18:53:14 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 18:53:14 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 25 18:53:14 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 25 18:53:14 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 25 18:53:14 compute-0 sshd[128747]: Server listening on 0.0.0.0 port 22.
Nov 25 18:53:14 compute-0 sshd[128747]: Server listening on :: port 22.
Nov 25 18:53:14 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 25 18:53:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:53:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:53:17 compute-0 systemd[1]: Reloading.
Nov 25 18:53:18 compute-0 systemd-rc-local-generator[129003]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:18 compute-0 systemd-sysv-generator[129007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:18 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:53:23 compute-0 sudo[110783]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:24 compute-0 sudo[133798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgirgwsnwvfqiragatghqhohjfbowxop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096803.5037713-652-121600311808483/AnsiballZ_systemd.py'
Nov 25 18:53:24 compute-0 sudo[133798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:24 compute-0 python3.9[133823]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:53:24 compute-0 systemd[1]: Reloading.
Nov 25 18:53:24 compute-0 systemd-sysv-generator[134203]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:24 compute-0 systemd-rc-local-generator[134199]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:24 compute-0 sudo[133798]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:25 compute-0 sudo[135019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuxfsfhhbbfolwbbudugufvznerbrxgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096805.1405313-652-50523647368489/AnsiballZ_systemd.py'
Nov 25 18:53:25 compute-0 sudo[135019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:25 compute-0 python3.9[135045]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:53:25 compute-0 systemd[1]: Reloading.
Nov 25 18:53:26 compute-0 systemd-sysv-generator[135443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:26 compute-0 systemd-rc-local-generator[135438]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:26 compute-0 sudo[135019]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:26 compute-0 sudo[136129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfmqwvlfqgihfbyropgifecfshpptvoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096806.4506674-652-204975135909692/AnsiballZ_systemd.py'
Nov 25 18:53:26 compute-0 sudo[136129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:27 compute-0 python3.9[136158]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:53:27 compute-0 systemd[1]: Reloading.
Nov 25 18:53:27 compute-0 systemd-rc-local-generator[136569]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:27 compute-0 systemd-sysv-generator[136574]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:27 compute-0 sudo[136129]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:28 compute-0 sudo[137290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jflcxugbmrcuyxrhezlmrcmlejsikyxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096807.6941059-652-239922129169404/AnsiballZ_systemd.py'
Nov 25 18:53:28 compute-0 sudo[137290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:28 compute-0 python3.9[137292]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:53:28 compute-0 systemd[1]: Reloading.
Nov 25 18:53:28 compute-0 systemd-sysv-generator[137586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:28 compute-0 systemd-rc-local-generator[137582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:28 compute-0 podman[137419]: 2025-11-25 18:53:28.723046017 +0000 UTC m=+0.177226818 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 18:53:28 compute-0 sudo[137290]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:29 compute-0 sudo[138315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqrxriaxcnfqnvohlwctxklmjzszryuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096809.211847-710-56120976152257/AnsiballZ_systemd.py'
Nov 25 18:53:29 compute-0 sudo[138315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:29 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:53:29 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:53:29 compute-0 systemd[1]: man-db-cache-update.service: Consumed 14.832s CPU time.
Nov 25 18:53:29 compute-0 systemd[1]: run-r6f9825c825b3442b8170d4805ce2340c.service: Deactivated successfully.
Nov 25 18:53:29 compute-0 python3.9[138324]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:29 compute-0 systemd[1]: Reloading.
Nov 25 18:53:30 compute-0 systemd-rc-local-generator[138354]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:30 compute-0 systemd-sysv-generator[138357]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:30 compute-0 sudo[138315]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:30 compute-0 sudo[138512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmdxilezlslwcxfiuyybwwymgcdqulwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096810.4311337-710-208559410982686/AnsiballZ_systemd.py'
Nov 25 18:53:30 compute-0 sudo[138512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:53:31.028 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:53:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:53:31.030 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:53:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:53:31.030 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:53:31 compute-0 python3.9[138514]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:31 compute-0 systemd[1]: Reloading.
Nov 25 18:53:31 compute-0 systemd-rc-local-generator[138542]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:31 compute-0 systemd-sysv-generator[138549]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:31 compute-0 sudo[138512]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:32 compute-0 sudo[138703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkxvygsfoxuynotpklqynpkvrhrmwilf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096811.747927-710-174696252396422/AnsiballZ_systemd.py'
Nov 25 18:53:32 compute-0 sudo[138703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:32 compute-0 python3.9[138705]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:32 compute-0 systemd[1]: Reloading.
Nov 25 18:53:32 compute-0 systemd-rc-local-generator[138736]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:32 compute-0 systemd-sysv-generator[138742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:32 compute-0 sudo[138703]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:33 compute-0 sudo[138906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bejlvndruojzfzxnfvdejzcupiavqrew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096813.0868564-710-56799566397889/AnsiballZ_systemd.py'
Nov 25 18:53:33 compute-0 sudo[138906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:33 compute-0 podman[138867]: 2025-11-25 18:53:33.531789094 +0000 UTC m=+0.090492384 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:53:33 compute-0 python3.9[138914]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:33 compute-0 sudo[138906]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:34 compute-0 sudo[139067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doflloczoigmshkyoinwwqqnrtrujuop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096814.214645-710-199154987635904/AnsiballZ_systemd.py'
Nov 25 18:53:34 compute-0 sudo[139067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:35 compute-0 python3.9[139069]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:36 compute-0 systemd[1]: Reloading.
Nov 25 18:53:36 compute-0 systemd-rc-local-generator[139100]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:36 compute-0 systemd-sysv-generator[139104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:36 compute-0 sudo[139067]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:37 compute-0 sudo[139257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eemzjpljohxofafgjtanjbxavuxjkqed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096816.796275-782-180987280983782/AnsiballZ_systemd.py'
Nov 25 18:53:37 compute-0 sudo[139257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:37 compute-0 python3.9[139259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 25 18:53:37 compute-0 systemd[1]: Reloading.
Nov 25 18:53:37 compute-0 systemd-sysv-generator[139293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:53:37 compute-0 systemd-rc-local-generator[139289]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:53:37 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 25 18:53:37 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 25 18:53:37 compute-0 sudo[139257]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:38 compute-0 sudo[139450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jehohgwfaeerdpmmpzzisgzyamrntvvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096818.2967424-798-261147766956006/AnsiballZ_systemd.py'
Nov 25 18:53:38 compute-0 sudo[139450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:39 compute-0 python3.9[139452]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:39 compute-0 sudo[139450]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:39 compute-0 sudo[139605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tszseuvfumocayhzzmzrvllolznedfmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096819.3110418-798-23135017133126/AnsiballZ_systemd.py'
Nov 25 18:53:39 compute-0 sudo[139605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:40 compute-0 python3.9[139607]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:40 compute-0 sudo[139605]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:40 compute-0 sudo[139760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlvbjwyespvgudblnuaddcrhuhyuraxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096820.3769615-798-84228845055411/AnsiballZ_systemd.py'
Nov 25 18:53:40 compute-0 sudo[139760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:41 compute-0 python3.9[139762]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:41 compute-0 sudo[139760]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:41 compute-0 sudo[139915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypverfxsfegdqyyqtxckbdztmrxkbmob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096821.4460373-798-197122047742834/AnsiballZ_systemd.py'
Nov 25 18:53:41 compute-0 sudo[139915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:42 compute-0 python3.9[139917]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:42 compute-0 sudo[139915]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:42 compute-0 sudo[140070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctsxlxntklyitklxoamzffcuzxoofnoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096822.48799-798-264637612984504/AnsiballZ_systemd.py'
Nov 25 18:53:42 compute-0 sudo[140070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:43 compute-0 python3.9[140072]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:43 compute-0 sudo[140070]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:43 compute-0 sudo[140225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eennmhkmqqiyrbhnwkeulcwngvwobgoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096823.5136993-798-268435171580058/AnsiballZ_systemd.py'
Nov 25 18:53:43 compute-0 sudo[140225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:44 compute-0 python3.9[140227]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:44 compute-0 sudo[140225]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:44 compute-0 sudo[140380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omfrcoesazdtedsudyrgdkvlouhkvvyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096824.4877563-798-10993569195081/AnsiballZ_systemd.py'
Nov 25 18:53:44 compute-0 sudo[140380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:45 compute-0 python3.9[140382]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:45 compute-0 sudo[140380]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:46 compute-0 sudo[140535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfdgmqeyvyyuvhsuxgeqxeqhfbsbywij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096825.588761-798-150847336275724/AnsiballZ_systemd.py'
Nov 25 18:53:46 compute-0 sudo[140535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:46 compute-0 python3.9[140537]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:46 compute-0 sudo[140535]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:47 compute-0 sudo[140690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uarorokdtufrrkkqlphajnzadtfjxoas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096826.6619246-798-220028092688386/AnsiballZ_systemd.py'
Nov 25 18:53:47 compute-0 sudo[140690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:47 compute-0 python3.9[140692]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:47 compute-0 sudo[140690]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:48 compute-0 sudo[140845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihpyqojvruxavfbebeodzokfwtjgjrfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096827.653758-798-128897786434013/AnsiballZ_systemd.py'
Nov 25 18:53:48 compute-0 sudo[140845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:48 compute-0 python3.9[140847]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:48 compute-0 sudo[140845]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:49 compute-0 sudo[141000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivdrjqqxokhrguputgwlokisemzmaloy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096828.589474-798-29429257619070/AnsiballZ_systemd.py'
Nov 25 18:53:49 compute-0 sudo[141000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:49 compute-0 python3.9[141002]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:49 compute-0 sudo[141000]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:50 compute-0 sudo[141155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ralihnaaxijlxxwfjagicmdhmzgjhjjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096829.9461753-798-243373687773331/AnsiballZ_systemd.py'
Nov 25 18:53:50 compute-0 sudo[141155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:50 compute-0 python3.9[141157]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:50 compute-0 sudo[141155]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:51 compute-0 sudo[141310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbifidrigfxtgowyxqxcxsfktgphzvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096830.9364169-798-53572613075664/AnsiballZ_systemd.py'
Nov 25 18:53:51 compute-0 sudo[141310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:51 compute-0 python3.9[141312]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:51 compute-0 sudo[141310]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:52 compute-0 sudo[141465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuftaifldwceeipkyhtameysxhacnsom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096831.8920724-798-51493640775984/AnsiballZ_systemd.py'
Nov 25 18:53:52 compute-0 sudo[141465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:52 compute-0 python3.9[141467]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 25 18:53:52 compute-0 sudo[141465]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:54 compute-0 sudo[141620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsvxtpcumjymfilyqjeiotamsxqkielp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096833.6917915-1002-2518978896046/AnsiballZ_file.py'
Nov 25 18:53:54 compute-0 sudo[141620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:54 compute-0 python3.9[141622]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:53:54 compute-0 sudo[141620]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:54 compute-0 sudo[141772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppcljbaejncwqlgcbebrzpzlezcjwjju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096834.5131564-1002-186342065497311/AnsiballZ_file.py'
Nov 25 18:53:54 compute-0 sudo[141772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:55 compute-0 python3.9[141774]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:53:55 compute-0 sudo[141772]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:55 compute-0 sudo[141924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myqqzxvqujksxanbtjfmavvxifljdhyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096835.355499-1002-109634521133422/AnsiballZ_file.py'
Nov 25 18:53:55 compute-0 sudo[141924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:55 compute-0 python3.9[141926]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:53:55 compute-0 sudo[141924]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:56 compute-0 sudo[142076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qokfkaszffebsfcecbilkwgdentnznpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096836.0895147-1002-141855461913606/AnsiballZ_file.py'
Nov 25 18:53:56 compute-0 sudo[142076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:56 compute-0 python3.9[142078]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:53:56 compute-0 sudo[142076]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:57 compute-0 sudo[142228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iejslyijjfkpvacfcxuturqamlcigwrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096836.9459698-1002-238352412820304/AnsiballZ_file.py'
Nov 25 18:53:57 compute-0 sudo[142228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:57 compute-0 python3.9[142230]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:53:57 compute-0 sudo[142228]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:58 compute-0 sudo[142380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjluxnhdfoiybskohkcwwqbcxvduaoeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096837.7566903-1002-220637691029103/AnsiballZ_file.py'
Nov 25 18:53:58 compute-0 sudo[142380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:58 compute-0 python3.9[142382]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:53:58 compute-0 sudo[142380]: pam_unix(sudo:session): session closed for user root
Nov 25 18:53:59 compute-0 podman[142482]: 2025-11-25 18:53:59.281855481 +0000 UTC m=+0.186215128 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller)
Nov 25 18:53:59 compute-0 sudo[142555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkkunnauyxtntooszkrsyapumsuklrxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096838.6442113-1088-186924134511116/AnsiballZ_stat.py'
Nov 25 18:53:59 compute-0 sudo[142555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:53:59 compute-0 python3.9[142559]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:53:59 compute-0 sudo[142555]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:00 compute-0 sudo[142683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwhftokvpjhmygqvlntpoeadnxoxsjdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096838.6442113-1088-186924134511116/AnsiballZ_copy.py'
Nov 25 18:54:00 compute-0 sudo[142683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:00 compute-0 python3.9[142685]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096838.6442113-1088-186924134511116/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:00 compute-0 sudo[142683]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:00 compute-0 sudo[142835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hduhdsfvgivthbfihvszolfupbnzwlhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096840.580281-1088-129320887125862/AnsiballZ_stat.py'
Nov 25 18:54:00 compute-0 sudo[142835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:01 compute-0 python3.9[142837]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:01 compute-0 sudo[142835]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:01 compute-0 sudo[142960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrtpjnsftekuenkqugrsvrntyhaxxgqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096840.580281-1088-129320887125862/AnsiballZ_copy.py'
Nov 25 18:54:01 compute-0 sudo[142960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:02 compute-0 python3.9[142962]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096840.580281-1088-129320887125862/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:02 compute-0 sudo[142960]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:02 compute-0 sudo[143112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvfftzfakpzvogenjwhipokhuiiayrct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096842.1997106-1088-214742593707360/AnsiballZ_stat.py'
Nov 25 18:54:02 compute-0 sudo[143112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:02 compute-0 python3.9[143114]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:02 compute-0 sudo[143112]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:03 compute-0 sudo[143237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndylqoohdrhqyqvsumqmamtzwqnohmgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096842.1997106-1088-214742593707360/AnsiballZ_copy.py'
Nov 25 18:54:03 compute-0 sudo[143237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:03 compute-0 python3.9[143239]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096842.1997106-1088-214742593707360/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:03 compute-0 sudo[143237]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:04 compute-0 sudo[143401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpanqbzegqtckcnwgckmuylpzlpzlvyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096843.6206589-1088-170717284402423/AnsiballZ_stat.py'
Nov 25 18:54:04 compute-0 sudo[143401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:04 compute-0 podman[143363]: 2025-11-25 18:54:04.046029146 +0000 UTC m=+0.081392221 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 18:54:04 compute-0 python3.9[143409]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:04 compute-0 sudo[143401]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:04 compute-0 sudo[143532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-radvugbklprskoqkoeuidbantcuzmmnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096843.6206589-1088-170717284402423/AnsiballZ_copy.py'
Nov 25 18:54:04 compute-0 sudo[143532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:04 compute-0 python3.9[143534]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096843.6206589-1088-170717284402423/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:04 compute-0 sudo[143532]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:05 compute-0 sudo[143684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iabsxvinczyrsrmejxrqmqanoimosxhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096845.1320853-1088-208492862490336/AnsiballZ_stat.py'
Nov 25 18:54:05 compute-0 sudo[143684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:05 compute-0 python3.9[143686]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:05 compute-0 sudo[143684]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:06 compute-0 sudo[143809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewryuogufmixlqudznbiesiokzapcmhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096845.1320853-1088-208492862490336/AnsiballZ_copy.py'
Nov 25 18:54:06 compute-0 sudo[143809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:06 compute-0 python3.9[143811]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096845.1320853-1088-208492862490336/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:06 compute-0 sudo[143809]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:07 compute-0 sudo[143961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fukfrrdzfebhmvspuriqbrshmwoxwotl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096846.827803-1088-84513459957052/AnsiballZ_stat.py'
Nov 25 18:54:07 compute-0 sudo[143961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:07 compute-0 python3.9[143963]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:07 compute-0 sudo[143961]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:08 compute-0 sudo[144086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekdqmrxglpdjcxhweelyclrielgtyhhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096846.827803-1088-84513459957052/AnsiballZ_copy.py'
Nov 25 18:54:08 compute-0 sudo[144086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:08 compute-0 python3.9[144088]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096846.827803-1088-84513459957052/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:08 compute-0 sudo[144086]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:08 compute-0 sudo[144238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttfxhnwkqfkmhmzxsohiywzdujvlifzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096848.5527048-1088-79277913315179/AnsiballZ_stat.py'
Nov 25 18:54:08 compute-0 sudo[144238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:09 compute-0 python3.9[144240]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:09 compute-0 sudo[144238]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:09 compute-0 sudo[144361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzvzbasuvpastqujferbdpolemhqyvhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096848.5527048-1088-79277913315179/AnsiballZ_copy.py'
Nov 25 18:54:09 compute-0 sudo[144361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:09 compute-0 python3.9[144363]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096848.5527048-1088-79277913315179/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:09 compute-0 sudo[144361]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:10 compute-0 sudo[144513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eotpuhcamzjepzntnaakyxqedvkjvoym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096850.1263192-1088-239419999614665/AnsiballZ_stat.py'
Nov 25 18:54:10 compute-0 sudo[144513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:10 compute-0 python3.9[144515]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:10 compute-0 sudo[144513]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:11 compute-0 sudo[144638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghuiujhfwlikixjyjwqwnzcxgupvfrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096850.1263192-1088-239419999614665/AnsiballZ_copy.py'
Nov 25 18:54:11 compute-0 sudo[144638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:11 compute-0 python3.9[144640]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764096850.1263192-1088-239419999614665/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:11 compute-0 sudo[144638]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:12 compute-0 sudo[144790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zioelnswgoitdcbgjqpatvqmxkgoslsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096851.9934387-1314-13925873764411/AnsiballZ_command.py'
Nov 25 18:54:12 compute-0 sudo[144790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:12 compute-0 python3.9[144792]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 25 18:54:12 compute-0 sudo[144790]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:13 compute-0 sudo[144943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqflqayppnatcsxclvgldbboxrpigmya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096852.9593105-1332-119861838663218/AnsiballZ_file.py'
Nov 25 18:54:13 compute-0 sudo[144943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:13 compute-0 python3.9[144945]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:13 compute-0 sudo[144943]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:14 compute-0 sudo[145095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sveywkhwcpjwkmbcdmypvzlyhjokbdae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096853.711868-1332-193094776657806/AnsiballZ_file.py'
Nov 25 18:54:14 compute-0 sudo[145095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:14 compute-0 python3.9[145097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:14 compute-0 sudo[145095]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:14 compute-0 sudo[145247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iinprjmhgnxisbcskivkifztagwcriun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096854.409089-1332-30578444598335/AnsiballZ_file.py'
Nov 25 18:54:14 compute-0 sudo[145247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:14 compute-0 python3.9[145249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:14 compute-0 sudo[145247]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:15 compute-0 sudo[145399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sutakcqhdixjwcewcuqeezhytiehdxrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096855.0633044-1332-40423814974892/AnsiballZ_file.py'
Nov 25 18:54:15 compute-0 sudo[145399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:15 compute-0 python3.9[145401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:15 compute-0 sudo[145399]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:16 compute-0 sudo[145551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utfgnvhaakwveulflbtnmtfvugcyfhyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096855.9252393-1332-129729868075614/AnsiballZ_file.py'
Nov 25 18:54:16 compute-0 sudo[145551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:16 compute-0 python3.9[145553]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:16 compute-0 sudo[145551]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:16 compute-0 sudo[145703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnugtkjhkbtmdfgwudeydoanxmbshedk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096856.616761-1332-216557488362656/AnsiballZ_file.py'
Nov 25 18:54:16 compute-0 sudo[145703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:17 compute-0 python3.9[145705]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:17 compute-0 sudo[145703]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:17 compute-0 sudo[145855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkqgubsqjhvydmbvnwefjaikusuwxqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096857.3404157-1332-246191505180118/AnsiballZ_file.py'
Nov 25 18:54:17 compute-0 sudo[145855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:17 compute-0 python3.9[145857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:17 compute-0 sudo[145855]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:18 compute-0 sudo[146007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiqdvqjzrmfxymalqvvglzgtmhqlwklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096858.027674-1332-6115137205908/AnsiballZ_file.py'
Nov 25 18:54:18 compute-0 sudo[146007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:18 compute-0 python3.9[146009]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:18 compute-0 sudo[146007]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:19 compute-0 sudo[146159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sneixcdxnievlzlcvqqlpkvvjvctnzuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096858.760918-1332-6069852614562/AnsiballZ_file.py'
Nov 25 18:54:19 compute-0 sudo[146159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:19 compute-0 python3.9[146161]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:19 compute-0 sudo[146159]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:19 compute-0 sudo[146311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehtpqqaykrmsotbgnhmthzxbnkbyeies ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096859.573423-1332-177206641018551/AnsiballZ_file.py'
Nov 25 18:54:19 compute-0 sudo[146311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:20 compute-0 python3.9[146313]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:20 compute-0 sudo[146311]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:20 compute-0 sudo[146463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldhjnkeuomxrykofdxvnmrokxyfazfvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096860.4085824-1332-21820478292384/AnsiballZ_file.py'
Nov 25 18:54:20 compute-0 sudo[146463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:21 compute-0 python3.9[146465]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:21 compute-0 sudo[146463]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:21 compute-0 sudo[146615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzbycudsyivbzxislcykiusufqcblwjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096861.2606094-1332-267330190777884/AnsiballZ_file.py'
Nov 25 18:54:21 compute-0 sudo[146615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:21 compute-0 python3.9[146617]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:21 compute-0 sudo[146615]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:22 compute-0 sudo[146767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elrueyqnpduhwmwruttjaqflfkbdgkiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096862.0829537-1332-53305276052335/AnsiballZ_file.py'
Nov 25 18:54:22 compute-0 sudo[146767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:22 compute-0 python3.9[146769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:22 compute-0 sudo[146767]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:23 compute-0 sudo[146919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijrcvfftoodmkjyttxfxynngmykvasgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096862.9148843-1332-39072649591400/AnsiballZ_file.py'
Nov 25 18:54:23 compute-0 sudo[146919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:23 compute-0 python3.9[146921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:23 compute-0 sudo[146919]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:25 compute-0 sudo[147071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcubsxqxiolonriekngythloghquwwsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096864.9616838-1530-214833352863410/AnsiballZ_stat.py'
Nov 25 18:54:25 compute-0 sudo[147071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:25 compute-0 python3.9[147073]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:25 compute-0 sudo[147071]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:26 compute-0 sudo[147194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atlsioexjswvhdyrzseyxyzftlhobvke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096864.9616838-1530-214833352863410/AnsiballZ_copy.py'
Nov 25 18:54:26 compute-0 sudo[147194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:26 compute-0 python3.9[147196]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096864.9616838-1530-214833352863410/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:26 compute-0 sudo[147194]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:27 compute-0 sudo[147346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qktdxemvpokuoxqhnpjqpgototygndxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096866.5419645-1530-149912691478996/AnsiballZ_stat.py'
Nov 25 18:54:27 compute-0 sudo[147346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:27 compute-0 python3.9[147348]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:27 compute-0 sudo[147346]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:27 compute-0 sudo[147469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipmslupcyqxsstosnqvhbqgzzcmlhful ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096866.5419645-1530-149912691478996/AnsiballZ_copy.py'
Nov 25 18:54:27 compute-0 sudo[147469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:27 compute-0 python3.9[147471]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096866.5419645-1530-149912691478996/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:27 compute-0 sudo[147469]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:28 compute-0 sudo[147621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyhpgaegjnbrrjydarjrqbdjjspuibjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096868.0899026-1530-268995573024166/AnsiballZ_stat.py'
Nov 25 18:54:28 compute-0 sudo[147621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:28 compute-0 python3.9[147623]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:28 compute-0 sudo[147621]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:29 compute-0 sudo[147744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meynoufjobzeuuloguiqcqnzaseqiagt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096868.0899026-1530-268995573024166/AnsiballZ_copy.py'
Nov 25 18:54:29 compute-0 sudo[147744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:29 compute-0 python3.9[147746]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096868.0899026-1530-268995573024166/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:29 compute-0 sudo[147744]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:30 compute-0 sudo[147909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzbgsaafwgfanbrgrvbrpagaxfkozmpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096869.6766925-1530-229683767724211/AnsiballZ_stat.py'
Nov 25 18:54:30 compute-0 sudo[147909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:30 compute-0 podman[147870]: 2025-11-25 18:54:30.138071134 +0000 UTC m=+0.131438019 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Nov 25 18:54:30 compute-0 python3.9[147916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:30 compute-0 sudo[147909]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:30 compute-0 sudo[148045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esumzxxzmmyvjyrkxsrdxtgasxlrxihk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096869.6766925-1530-229683767724211/AnsiballZ_copy.py'
Nov 25 18:54:30 compute-0 sudo[148045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:54:31.033 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:54:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:54:31.034 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:54:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:54:31.034 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:54:31 compute-0 python3.9[148047]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096869.6766925-1530-229683767724211/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:31 compute-0 sudo[148045]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:31 compute-0 sudo[148198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyzqbksmduxahjarwwcelngyqqtvkiwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096871.250152-1530-199124014163579/AnsiballZ_stat.py'
Nov 25 18:54:31 compute-0 sudo[148198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:31 compute-0 python3.9[148200]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:31 compute-0 sudo[148198]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:32 compute-0 sudo[148321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijqgdimkbjqrlusikmpubhoexsgflnur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096871.250152-1530-199124014163579/AnsiballZ_copy.py'
Nov 25 18:54:32 compute-0 sudo[148321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:32 compute-0 python3.9[148323]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096871.250152-1530-199124014163579/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:32 compute-0 sudo[148321]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:32 compute-0 sudo[148473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akhpzdhnvnwucrgdxrdfpsptmlizkjlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096872.6140785-1530-106328662080995/AnsiballZ_stat.py'
Nov 25 18:54:32 compute-0 sudo[148473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:33 compute-0 python3.9[148475]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:33 compute-0 sudo[148473]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:33 compute-0 sudo[148596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ielazicgkdmztqfubhndiffyuifknqdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096872.6140785-1530-106328662080995/AnsiballZ_copy.py'
Nov 25 18:54:33 compute-0 sudo[148596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:34 compute-0 python3.9[148598]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096872.6140785-1530-106328662080995/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:34 compute-0 sudo[148596]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:34 compute-0 podman[148722]: 2025-11-25 18:54:34.608382124 +0000 UTC m=+0.066678202 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 25 18:54:34 compute-0 sudo[148766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaajrccblvnjpcwxiamnosdwaereabyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096874.240436-1530-267549245727717/AnsiballZ_stat.py'
Nov 25 18:54:34 compute-0 sudo[148766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:34 compute-0 python3.9[148770]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:34 compute-0 sudo[148766]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:35 compute-0 sudo[148891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brvjzmspieekshnlzgosfovwzkrsrnib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096874.240436-1530-267549245727717/AnsiballZ_copy.py'
Nov 25 18:54:35 compute-0 sudo[148891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:35 compute-0 python3.9[148893]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096874.240436-1530-267549245727717/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:35 compute-0 sudo[148891]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:36 compute-0 sudo[149043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtjldefhiqwifrlnsmiidhqfsylkijw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096875.6858087-1530-80659974635206/AnsiballZ_stat.py'
Nov 25 18:54:36 compute-0 sudo[149043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:36 compute-0 python3.9[149045]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:36 compute-0 sudo[149043]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:36 compute-0 sudo[149166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfsxggteksxggkptqrgeohxuvjhvqeun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096875.6858087-1530-80659974635206/AnsiballZ_copy.py'
Nov 25 18:54:36 compute-0 sudo[149166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:37 compute-0 python3.9[149168]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096875.6858087-1530-80659974635206/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:37 compute-0 sudo[149166]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:37 compute-0 sudo[149318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxcuigcyccsbzznlclassagwaeekwfhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096877.1789699-1530-162015411259060/AnsiballZ_stat.py'
Nov 25 18:54:37 compute-0 sudo[149318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:37 compute-0 python3.9[149320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:37 compute-0 sudo[149318]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:38 compute-0 sudo[149441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpiwydfsqmnminhqpomfhgbvmhstqlhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096877.1789699-1530-162015411259060/AnsiballZ_copy.py'
Nov 25 18:54:38 compute-0 sudo[149441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:38 compute-0 python3.9[149443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096877.1789699-1530-162015411259060/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:38 compute-0 sudo[149441]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:39 compute-0 sudo[149593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvvtcqshqodrgjmgngqncstcrirjgywz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096878.6707313-1530-148869433968914/AnsiballZ_stat.py'
Nov 25 18:54:39 compute-0 sudo[149593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:39 compute-0 python3.9[149595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:39 compute-0 sudo[149593]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:39 compute-0 sudo[149716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zafuhhzoiptqbkclnvenghtdfufxppik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096878.6707313-1530-148869433968914/AnsiballZ_copy.py'
Nov 25 18:54:39 compute-0 sudo[149716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:40 compute-0 python3.9[149718]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096878.6707313-1530-148869433968914/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:40 compute-0 sudo[149716]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:40 compute-0 sudo[149868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaaawbvembrctpxwqwzhhudvlrycxuhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096880.2256467-1530-132190852198283/AnsiballZ_stat.py'
Nov 25 18:54:40 compute-0 sudo[149868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:40 compute-0 python3.9[149870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:40 compute-0 sudo[149868]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:41 compute-0 sudo[149991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhzuudvsmpxzpqyisiypahcrqgppvgpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096880.2256467-1530-132190852198283/AnsiballZ_copy.py'
Nov 25 18:54:41 compute-0 sudo[149991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:41 compute-0 python3.9[149993]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096880.2256467-1530-132190852198283/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:41 compute-0 sudo[149991]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:42 compute-0 sudo[150143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcgrideszmgcdoxhztukbkzxpkhlolth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096881.7004344-1530-52122759119823/AnsiballZ_stat.py'
Nov 25 18:54:42 compute-0 sudo[150143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:42 compute-0 python3.9[150145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:42 compute-0 sudo[150143]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:42 compute-0 sudo[150266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkjavypqavljjklaoqdxfilmrglaufyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096881.7004344-1530-52122759119823/AnsiballZ_copy.py'
Nov 25 18:54:42 compute-0 sudo[150266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:42 compute-0 python3.9[150268]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096881.7004344-1530-52122759119823/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:42 compute-0 sudo[150266]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:43 compute-0 sudo[150418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxkjaxjvtmvqlctdbapexbikiqrtikeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096883.1768365-1530-171329607206219/AnsiballZ_stat.py'
Nov 25 18:54:43 compute-0 sudo[150418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:43 compute-0 python3.9[150420]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:43 compute-0 sudo[150418]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:44 compute-0 sudo[150541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdnbeelmsqfotormuufabapofhkbdus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096883.1768365-1530-171329607206219/AnsiballZ_copy.py'
Nov 25 18:54:44 compute-0 sudo[150541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:44 compute-0 python3.9[150543]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096883.1768365-1530-171329607206219/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:44 compute-0 sudo[150541]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:45 compute-0 sudo[150693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwajlyhbsfhgbxjflfucnrpdnnxwgsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096884.6137476-1530-40293075917098/AnsiballZ_stat.py'
Nov 25 18:54:45 compute-0 sudo[150693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:45 compute-0 python3.9[150695]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:54:45 compute-0 sudo[150693]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:45 compute-0 sudo[150816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wenyibmdoasjhjfefmojqzetoqopbnao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096884.6137476-1530-40293075917098/AnsiballZ_copy.py'
Nov 25 18:54:45 compute-0 sudo[150816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:45 compute-0 python3.9[150818]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096884.6137476-1530-40293075917098/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:45 compute-0 sudo[150816]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:46 compute-0 python3.9[150968]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:54:47 compute-0 sudo[151121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tffbwtoncudbnaulfuqfzozvwyjpvbef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096887.0547943-1942-116720455152975/AnsiballZ_seboolean.py'
Nov 25 18:54:47 compute-0 sudo[151121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:47 compute-0 python3.9[151123]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 25 18:54:50 compute-0 sudo[151121]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:50 compute-0 sudo[151277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcymzzkxhexvrtdbbllfwxtxoykqdqrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096890.3612838-1958-80470196981380/AnsiballZ_copy.py'
Nov 25 18:54:50 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 25 18:54:50 compute-0 sudo[151277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:51 compute-0 python3.9[151279]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:51 compute-0 sudo[151277]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:51 compute-0 sudo[151429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azahpndivcmhvoeewalyymmgcxldcfzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096891.2079408-1958-159072683832179/AnsiballZ_copy.py'
Nov 25 18:54:51 compute-0 sudo[151429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:51 compute-0 python3.9[151431]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:51 compute-0 sudo[151429]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:52 compute-0 sudo[151581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fudyvnqzbjptgbsdpbfudovqjzjydpil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096891.9860086-1958-182215446079587/AnsiballZ_copy.py'
Nov 25 18:54:52 compute-0 sudo[151581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:52 compute-0 python3.9[151583]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:52 compute-0 sudo[151581]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:52 compute-0 sudo[151733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlynpvvdgdolvkguztdjrxvvrbddxpbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096892.6965106-1958-108200946846598/AnsiballZ_copy.py'
Nov 25 18:54:52 compute-0 sudo[151733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:53 compute-0 python3.9[151735]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:53 compute-0 sudo[151733]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:53 compute-0 sudo[151885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsffqnkfpxxahfijvlecjpbbngzlcilk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096893.4054518-1958-33149554725809/AnsiballZ_copy.py'
Nov 25 18:54:53 compute-0 sudo[151885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:53 compute-0 python3.9[151887]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:53 compute-0 sudo[151885]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:54 compute-0 sudo[152037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axjrkafwddetvwlirwmqwdxeblptipqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096894.2781777-2030-43802654433115/AnsiballZ_copy.py'
Nov 25 18:54:54 compute-0 sudo[152037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:54 compute-0 python3.9[152039]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:54 compute-0 sudo[152037]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:55 compute-0 sudo[152189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxahostmaiklpdzxxutcxqotjloxwggt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096895.0640152-2030-275066357444239/AnsiballZ_copy.py'
Nov 25 18:54:55 compute-0 sudo[152189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:55 compute-0 python3.9[152191]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:55 compute-0 sudo[152189]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:56 compute-0 sudo[152341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dugdugjyqvryosnaumylqbgbijemzhpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096895.801813-2030-64677905699630/AnsiballZ_copy.py'
Nov 25 18:54:56 compute-0 sudo[152341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:56 compute-0 python3.9[152343]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:56 compute-0 sudo[152341]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:56 compute-0 sudo[152493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahdvtjnvcraeisgljnkzbrofkqgkssix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096896.580847-2030-261144054890219/AnsiballZ_copy.py'
Nov 25 18:54:56 compute-0 sudo[152493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:57 compute-0 python3.9[152495]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:57 compute-0 sudo[152493]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:57 compute-0 sudo[152645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vknezcjkcuvmqqgmzncjzsjgolyiqrbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096897.336967-2030-55997991440161/AnsiballZ_copy.py'
Nov 25 18:54:57 compute-0 sudo[152645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:57 compute-0 python3.9[152647]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:54:57 compute-0 sudo[152645]: pam_unix(sudo:session): session closed for user root
Nov 25 18:54:58 compute-0 sudo[152797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pifqmcuiuhepjzgxwsfedvldhziccnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096898.2806022-2102-17403965148423/AnsiballZ_systemd.py'
Nov 25 18:54:58 compute-0 sudo[152797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:54:58 compute-0 python3.9[152799]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:54:58 compute-0 systemd[1]: Reloading.
Nov 25 18:54:59 compute-0 systemd-sysv-generator[152830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:54:59 compute-0 systemd-rc-local-generator[152827]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:54:59 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 25 18:54:59 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 25 18:54:59 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 25 18:54:59 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 25 18:54:59 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 25 18:54:59 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 25 18:54:59 compute-0 sudo[152797]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:00 compute-0 sudo[152990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paaqkaluuhhalueynztqyeuozhmxziee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096899.7084205-2102-53731631168027/AnsiballZ_systemd.py'
Nov 25 18:55:00 compute-0 sudo[152990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:00 compute-0 python3.9[152992]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:55:00 compute-0 systemd[1]: Reloading.
Nov 25 18:55:00 compute-0 systemd-sysv-generator[153041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:00 compute-0 systemd-rc-local-generator[153035]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:00 compute-0 podman[152994]: 2025-11-25 18:55:00.552299903 +0000 UTC m=+0.137158025 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller)
Nov 25 18:55:00 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 25 18:55:00 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 25 18:55:00 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 25 18:55:00 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 25 18:55:00 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 25 18:55:00 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 25 18:55:00 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 25 18:55:00 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 25 18:55:00 compute-0 sudo[152990]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:01 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 25 18:55:01 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 25 18:55:01 compute-0 sudo[153232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lztwegsjpnuhjicoebwhojlenhfzdtif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096901.0180893-2102-103766245886726/AnsiballZ_systemd.py'
Nov 25 18:55:01 compute-0 sudo[153232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:01 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 25 18:55:01 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 25 18:55:01 compute-0 python3.9[153234]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:55:01 compute-0 systemd[1]: Reloading.
Nov 25 18:55:01 compute-0 systemd-rc-local-generator[153270]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:01 compute-0 systemd-sysv-generator[153274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:02 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 25 18:55:02 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 25 18:55:02 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 25 18:55:02 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 25 18:55:02 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 25 18:55:02 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 25 18:55:02 compute-0 sudo[153232]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:02 compute-0 setroubleshoot[153129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 11a1d5ef-7c8d-4f40-81af-01a54277dd8d
Nov 25 18:55:02 compute-0 setroubleshoot[153129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 25 18:55:02 compute-0 sudo[153452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlgzbkcmneorovffhwnbythdoknqhsia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096902.4520903-2102-1993717966646/AnsiballZ_systemd.py'
Nov 25 18:55:02 compute-0 sudo[153452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:03 compute-0 python3.9[153454]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:55:03 compute-0 systemd[1]: Reloading.
Nov 25 18:55:03 compute-0 systemd-sysv-generator[153484]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:03 compute-0 systemd-rc-local-generator[153476]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:03 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 25 18:55:03 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 25 18:55:03 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 25 18:55:03 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 25 18:55:03 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 25 18:55:03 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 25 18:55:03 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 25 18:55:03 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 25 18:55:03 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 25 18:55:03 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 25 18:55:03 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 25 18:55:03 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 25 18:55:03 compute-0 sudo[153452]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:04 compute-0 sudo[153667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpufyxylwvjcderhrpmauhpaxkknjoes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096903.971202-2102-33718629886374/AnsiballZ_systemd.py'
Nov 25 18:55:04 compute-0 sudo[153667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:04 compute-0 python3.9[153669]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:55:04 compute-0 systemd[1]: Reloading.
Nov 25 18:55:04 compute-0 systemd-rc-local-generator[153710]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:04 compute-0 systemd-sysv-generator[153716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:04 compute-0 podman[153671]: 2025-11-25 18:55:04.900253011 +0000 UTC m=+0.112649646 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 18:55:05 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 25 18:55:05 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 25 18:55:05 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 25 18:55:05 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 25 18:55:05 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 25 18:55:05 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 25 18:55:05 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 25 18:55:05 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 25 18:55:05 compute-0 sudo[153667]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:06 compute-0 sudo[153897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iowqnukucpvtyomoilxjbfpqmyirzkuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096906.2548735-2176-45162380238991/AnsiballZ_file.py'
Nov 25 18:55:06 compute-0 sudo[153897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:06 compute-0 python3.9[153899]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:06 compute-0 sudo[153897]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:07 compute-0 sudo[154049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gssntvsjahkuxggxfmzcwnigcvukmjia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096907.1685722-2192-1956687344678/AnsiballZ_find.py'
Nov 25 18:55:07 compute-0 sudo[154049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:07 compute-0 python3.9[154051]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 18:55:07 compute-0 sudo[154049]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:08 compute-0 sudo[154201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arpuvjxkjdumdlawmjytxupsxbhdeuak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096908.2637768-2220-161138161877382/AnsiballZ_stat.py'
Nov 25 18:55:08 compute-0 sudo[154201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:08 compute-0 python3.9[154203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:08 compute-0 sudo[154201]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:09 compute-0 sudo[154324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahuzcvdfauxgcknuoppkqwkxposwmpqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096908.2637768-2220-161138161877382/AnsiballZ_copy.py'
Nov 25 18:55:09 compute-0 sudo[154324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:09 compute-0 python3.9[154326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096908.2637768-2220-161138161877382/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:09 compute-0 sudo[154324]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:10 compute-0 sudo[154476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxepwdplfzpgclugonybtgbzgahyhhgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096909.9530025-2252-63106120756663/AnsiballZ_file.py'
Nov 25 18:55:10 compute-0 sudo[154476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:10 compute-0 python3.9[154478]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:10 compute-0 sudo[154476]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:11 compute-0 sudo[154628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plwdkttmcgkezlewjffymjiiaujenlbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096910.8446565-2268-148667930334320/AnsiballZ_stat.py'
Nov 25 18:55:11 compute-0 sudo[154628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:11 compute-0 python3.9[154630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:11 compute-0 sudo[154628]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:11 compute-0 sudo[154706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwoulhkqskxucikvdfevxgnatjidldcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096910.8446565-2268-148667930334320/AnsiballZ_file.py'
Nov 25 18:55:11 compute-0 sudo[154706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:11 compute-0 python3.9[154708]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:11 compute-0 sudo[154706]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:12 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 25 18:55:12 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.036s CPU time.
Nov 25 18:55:12 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 25 18:55:12 compute-0 sudo[154858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leffjboszeybgejmndbsbfufvylcxccl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096912.321871-2292-49585673061488/AnsiballZ_stat.py'
Nov 25 18:55:12 compute-0 sudo[154858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:12 compute-0 python3.9[154860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:12 compute-0 sudo[154858]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:13 compute-0 sudo[154936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxjiygbhaubjdfxdzwiejhmptkwqomx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096912.321871-2292-49585673061488/AnsiballZ_file.py'
Nov 25 18:55:13 compute-0 sudo[154936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:13 compute-0 python3.9[154938]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6vao9xcn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:13 compute-0 sudo[154936]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:14 compute-0 sudo[155088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgovuqiyhummlcifdcuvftjxitcprlso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096913.711317-2316-270969763134996/AnsiballZ_stat.py'
Nov 25 18:55:14 compute-0 sudo[155088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:14 compute-0 python3.9[155090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:14 compute-0 sudo[155088]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:14 compute-0 sudo[155166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grprrkhsqipxqoxepdgaytstgjcxpfgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096913.711317-2316-270969763134996/AnsiballZ_file.py'
Nov 25 18:55:14 compute-0 sudo[155166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:14 compute-0 python3.9[155168]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:14 compute-0 sudo[155166]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:15 compute-0 sudo[155318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbslglhdjdgiskifnegsyvbbqxnkltfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096915.1764345-2342-95515281504321/AnsiballZ_command.py'
Nov 25 18:55:15 compute-0 sudo[155318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:15 compute-0 python3.9[155320]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:55:15 compute-0 sudo[155318]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:16 compute-0 sudo[155471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eicsdkroxolsvvhaadlhlpbrsbzlesje ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764096916.110034-2358-70983783410609/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 18:55:16 compute-0 sudo[155471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:16 compute-0 python3[155473]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 18:55:16 compute-0 sudo[155471]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:17 compute-0 sudo[155623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgdiihbxgfihtxppvgvdfjvovkzigckm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096917.0714793-2374-105179002721559/AnsiballZ_stat.py'
Nov 25 18:55:17 compute-0 sudo[155623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:17 compute-0 python3.9[155625]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:17 compute-0 sudo[155623]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:17 compute-0 sudo[155701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcrcsetdtakxwugisufvkgiqnrxorpya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096917.0714793-2374-105179002721559/AnsiballZ_file.py'
Nov 25 18:55:17 compute-0 sudo[155701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:18 compute-0 python3.9[155703]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:18 compute-0 sudo[155701]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:19 compute-0 sudo[155853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkpjhveycvococonenymjnzmbvcupznk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096918.7257419-2398-130051373022439/AnsiballZ_stat.py'
Nov 25 18:55:19 compute-0 sudo[155853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:19 compute-0 python3.9[155855]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:19 compute-0 sudo[155853]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:19 compute-0 sudo[155931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hubsjnkcjwjanvqjniwtbysekvljifwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096918.7257419-2398-130051373022439/AnsiballZ_file.py'
Nov 25 18:55:19 compute-0 sudo[155931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:19 compute-0 python3.9[155933]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:19 compute-0 sudo[155931]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:20 compute-0 sudo[156083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hezdntautioimvavtmkvliforvpgajds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096920.0782113-2422-162197164139293/AnsiballZ_stat.py'
Nov 25 18:55:20 compute-0 sudo[156083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:20 compute-0 python3.9[156085]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:20 compute-0 sudo[156083]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:21 compute-0 sudo[156161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhevfakiknwspqqvaufhxmpknkztafpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096920.0782113-2422-162197164139293/AnsiballZ_file.py'
Nov 25 18:55:21 compute-0 sudo[156161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:21 compute-0 python3.9[156163]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:21 compute-0 sudo[156161]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:21 compute-0 sudo[156313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olejhmmvdniblsekdwdxsknnctqfuilz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096921.4674754-2446-65922444008910/AnsiballZ_stat.py'
Nov 25 18:55:21 compute-0 sudo[156313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:22 compute-0 python3.9[156315]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:22 compute-0 sudo[156313]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:22 compute-0 sudo[156391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojxbwistpopgluxfcdriorxijcfpgond ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096921.4674754-2446-65922444008910/AnsiballZ_file.py'
Nov 25 18:55:22 compute-0 sudo[156391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:22 compute-0 python3.9[156393]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:22 compute-0 sudo[156391]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:23 compute-0 sudo[156543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlrygndnjvttmuiujcounnohxpclqsym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096922.875198-2470-272208307992903/AnsiballZ_stat.py'
Nov 25 18:55:23 compute-0 sudo[156543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:23 compute-0 python3.9[156545]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:23 compute-0 sudo[156543]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:24 compute-0 sudo[156668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agpnvhqbnsrgfvrdvotyngiapltetnbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096922.875198-2470-272208307992903/AnsiballZ_copy.py'
Nov 25 18:55:24 compute-0 sudo[156668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:24 compute-0 python3.9[156670]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764096922.875198-2470-272208307992903/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:24 compute-0 sudo[156668]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:25 compute-0 sudo[156820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhkhuguzawjzoylgwlpmrqmttymplxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096924.6610613-2500-140290426902064/AnsiballZ_file.py'
Nov 25 18:55:25 compute-0 sudo[156820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:25 compute-0 python3.9[156822]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:25 compute-0 sudo[156820]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:25 compute-0 sudo[156972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpiuamlwsbsxqdimmlbtmhgoajicriqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096925.5171432-2516-179712406530618/AnsiballZ_command.py'
Nov 25 18:55:25 compute-0 sudo[156972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:26 compute-0 python3.9[156974]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:55:26 compute-0 sudo[156972]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:27 compute-0 sudo[157127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbvntexifohkdvxklrwhgrxttmyseorx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096926.4191215-2532-234003786897922/AnsiballZ_blockinfile.py'
Nov 25 18:55:27 compute-0 sudo[157127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:27 compute-0 python3.9[157129]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:27 compute-0 sudo[157127]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:28 compute-0 sudo[157279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yphqfcgznormyzgctooqblubrgyroekm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096927.6318095-2550-124635861307352/AnsiballZ_command.py'
Nov 25 18:55:28 compute-0 sudo[157279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:28 compute-0 python3.9[157281]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:55:28 compute-0 sudo[157279]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:28 compute-0 sudo[157432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaxctfekrfcdqeycukompwvnxmgjxktn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096928.489987-2566-197902826098713/AnsiballZ_stat.py'
Nov 25 18:55:28 compute-0 sudo[157432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:29 compute-0 python3.9[157434]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:55:29 compute-0 sudo[157432]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:29 compute-0 sudo[157586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdxcpubcisxjqxfpfccljdptlqqbumlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096929.4182763-2582-121265767974056/AnsiballZ_command.py'
Nov 25 18:55:29 compute-0 sudo[157586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:30 compute-0 python3.9[157588]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:55:30 compute-0 sudo[157586]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:55:31.036 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:55:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:55:31.038 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:55:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:55:31.038 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:55:31 compute-0 podman[157670]: 2025-11-25 18:55:31.246281678 +0000 UTC m=+0.162839994 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:55:31 compute-0 sudo[157768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osmzlzvfxlnhccvmyvfkfywbmktxvsww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096930.927418-2598-215404971014505/AnsiballZ_file.py'
Nov 25 18:55:31 compute-0 sudo[157768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:31 compute-0 python3.9[157770]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:31 compute-0 sudo[157768]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:32 compute-0 sudo[157920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stevbdqmouwkcrsffefstaulrtmabaio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096931.809021-2614-273176929982658/AnsiballZ_stat.py'
Nov 25 18:55:32 compute-0 sudo[157920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:32 compute-0 python3.9[157922]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:32 compute-0 sudo[157920]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:32 compute-0 sudo[158043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcysxpbklawdskxnmwlcfekuplunhvwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096931.809021-2614-273176929982658/AnsiballZ_copy.py'
Nov 25 18:55:32 compute-0 sudo[158043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:33 compute-0 python3.9[158045]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096931.809021-2614-273176929982658/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:33 compute-0 sudo[158043]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:33 compute-0 sudo[158195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nklhkhylejysacpsajqyisouzetcmiax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096933.3559244-2644-188372862684241/AnsiballZ_stat.py'
Nov 25 18:55:33 compute-0 sudo[158195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:33 compute-0 python3.9[158197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:33 compute-0 sudo[158195]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:34 compute-0 sudo[158318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enmlcgcxigfpvtkkmvftgpbgpfzcvlhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096933.3559244-2644-188372862684241/AnsiballZ_copy.py'
Nov 25 18:55:34 compute-0 sudo[158318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:34 compute-0 python3.9[158320]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096933.3559244-2644-188372862684241/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:34 compute-0 sudo[158318]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:35 compute-0 podman[158397]: 2025-11-25 18:55:35.164379143 +0000 UTC m=+0.080338505 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Nov 25 18:55:35 compute-0 sudo[158489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvtffikfwbryfmjsefuabtbwowyzcidm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096934.977174-2674-172012057400886/AnsiballZ_stat.py'
Nov 25 18:55:35 compute-0 sudo[158489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:35 compute-0 python3.9[158491]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:55:35 compute-0 sudo[158489]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:36 compute-0 sudo[158612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boknzgvjtfnagwgfvrsmzmnztqhotjjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096934.977174-2674-172012057400886/AnsiballZ_copy.py'
Nov 25 18:55:36 compute-0 sudo[158612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:36 compute-0 python3.9[158614]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096934.977174-2674-172012057400886/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:55:36 compute-0 sudo[158612]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:37 compute-0 sudo[158764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkyhtgpecoaqksrcjtzpxvrcvmpriapb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096936.6429975-2704-226414652337957/AnsiballZ_systemd.py'
Nov 25 18:55:37 compute-0 sudo[158764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:37 compute-0 python3.9[158766]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:55:37 compute-0 systemd[1]: Reloading.
Nov 25 18:55:37 compute-0 systemd-rc-local-generator[158792]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:37 compute-0 systemd-sysv-generator[158796]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:37 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 25 18:55:37 compute-0 sudo[158764]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:38 compute-0 sudo[158955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhxdrwvvudgqcthawbgptgilddjhxqec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096938.0906284-2720-272013533672717/AnsiballZ_systemd.py'
Nov 25 18:55:38 compute-0 sudo[158955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:38 compute-0 python3.9[158957]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 25 18:55:38 compute-0 systemd[1]: Reloading.
Nov 25 18:55:38 compute-0 systemd-sysv-generator[158988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:38 compute-0 systemd-rc-local-generator[158983]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:39 compute-0 systemd[1]: Reloading.
Nov 25 18:55:39 compute-0 systemd-rc-local-generator[159020]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:55:39 compute-0 systemd-sysv-generator[159025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:55:39 compute-0 sudo[158955]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:39 compute-0 sshd-session[104483]: Connection closed by 192.168.122.30 port 37754
Nov 25 18:55:39 compute-0 sshd-session[104480]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:55:39 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 25 18:55:39 compute-0 systemd[1]: session-23.scope: Consumed 4min 103ms CPU time.
Nov 25 18:55:39 compute-0 systemd-logind[820]: Session 23 logged out. Waiting for processes to exit.
Nov 25 18:55:39 compute-0 systemd-logind[820]: Removed session 23.
Nov 25 18:55:45 compute-0 sshd-session[159053]: Accepted publickey for zuul from 192.168.122.30 port 32820 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:55:45 compute-0 systemd-logind[820]: New session 24 of user zuul.
Nov 25 18:55:45 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 25 18:55:45 compute-0 sshd-session[159053]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:55:46 compute-0 python3.9[159206]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:55:47 compute-0 python3.9[159360]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:55:47 compute-0 network[159377]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:55:47 compute-0 network[159378]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:55:47 compute-0 network[159379]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:55:54 compute-0 sudo[159648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeqlalhwdctuziknvuowscyyasgotema ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096953.641419-74-240175743095111/AnsiballZ_setup.py'
Nov 25 18:55:54 compute-0 sudo[159648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:54 compute-0 python3.9[159650]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 25 18:55:54 compute-0 sudo[159648]: pam_unix(sudo:session): session closed for user root
Nov 25 18:55:55 compute-0 sudo[159732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbcwnkoqpxtwinrptkxovrxtseawfdlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096953.641419-74-240175743095111/AnsiballZ_dnf.py'
Nov 25 18:55:55 compute-0 sudo[159732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:55:55 compute-0 python3.9[159734]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:56:00 compute-0 sudo[159732]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:01 compute-0 sudo[159885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haqqhqjnxtyohghusfkjwtfppvpqgqll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096960.5411844-98-100182996656263/AnsiballZ_stat.py'
Nov 25 18:56:01 compute-0 sudo[159885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:01 compute-0 python3.9[159887]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:56:01 compute-0 sudo[159885]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:02 compute-0 sudo[160052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfxznhbjgpxkxfpgqxubrhgikgbizutc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096961.6309223-118-249611026919009/AnsiballZ_command.py'
Nov 25 18:56:02 compute-0 sudo[160052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:02 compute-0 podman[160010]: 2025-11-25 18:56:02.196852083 +0000 UTC m=+0.113555965 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller)
Nov 25 18:56:02 compute-0 python3.9[160058]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:56:02 compute-0 sudo[160052]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:03 compute-0 sudo[160214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkvchcovhrfsofpmeueuijddaimvoqhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096962.7552793-138-81203685252897/AnsiballZ_stat.py'
Nov 25 18:56:03 compute-0 sudo[160214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:03 compute-0 python3.9[160216]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:56:03 compute-0 sudo[160214]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:04 compute-0 sudo[160366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpbsyjtrltjecdqixjedcugwwawqzwaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096963.712282-154-118695390103371/AnsiballZ_command.py'
Nov 25 18:56:04 compute-0 sudo[160366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:04 compute-0 python3.9[160368]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:56:04 compute-0 sudo[160366]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:04 compute-0 sudo[160519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmmrcxqdsecxjxeikdhzippgdvtwsxni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096964.5911465-170-132503299451473/AnsiballZ_stat.py'
Nov 25 18:56:04 compute-0 sudo[160519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:05 compute-0 python3.9[160521]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:05 compute-0 sudo[160519]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:05 compute-0 sudo[160654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzzsypeazofdmcfrsrfetmgakryabpfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096964.5911465-170-132503299451473/AnsiballZ_copy.py'
Nov 25 18:56:05 compute-0 sudo[160654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:05 compute-0 podman[160616]: 2025-11-25 18:56:05.879894668 +0000 UTC m=+0.084215459 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 18:56:06 compute-0 python3.9[160662]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096964.5911465-170-132503299451473/.source.iscsi _original_basename=.2lc4m8j7 follow=False checksum=2130e60c55071058b53c8e0ea55bf53fbad55ed2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:06 compute-0 sudo[160654]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:06 compute-0 sudo[160812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uytmgnkgfbcdzyeephhjricdjjripacr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096966.2899754-200-275814675532014/AnsiballZ_file.py'
Nov 25 18:56:06 compute-0 sudo[160812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:07 compute-0 python3.9[160814]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:07 compute-0 sudo[160812]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:08 compute-0 sudo[160964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwlqjotnfwfszmshezgqghjdacdrrjib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096967.352256-216-62604555415477/AnsiballZ_lineinfile.py'
Nov 25 18:56:08 compute-0 sudo[160964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:08 compute-0 python3.9[160966]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:08 compute-0 sudo[160964]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:08 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:56:08 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:56:09 compute-0 sudo[161117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suouylljoucuezfdeejyhixuzvdzbnpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096968.5886712-234-189032513620073/AnsiballZ_systemd_service.py'
Nov 25 18:56:09 compute-0 sudo[161117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:09 compute-0 python3.9[161119]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:56:09 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 25 18:56:09 compute-0 sudo[161117]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:10 compute-0 sudo[161273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtgqjmokcjuzmtifytdmficllqfslda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096970.050337-250-211676360313896/AnsiballZ_systemd_service.py'
Nov 25 18:56:10 compute-0 sudo[161273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:10 compute-0 python3.9[161275]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:56:10 compute-0 systemd[1]: Reloading.
Nov 25 18:56:10 compute-0 systemd-rc-local-generator[161305]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:56:10 compute-0 systemd-sysv-generator[161309]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:56:11 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 25 18:56:11 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 25 18:56:11 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 25 18:56:11 compute-0 systemd[1]: Started Open-iSCSI.
Nov 25 18:56:11 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 25 18:56:11 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 25 18:56:11 compute-0 sudo[161273]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:12 compute-0 sudo[161473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsndndeozuerveppwcildmpcjijnpect ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096971.8107238-272-60068746468024/AnsiballZ_service_facts.py'
Nov 25 18:56:12 compute-0 sudo[161473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:12 compute-0 python3.9[161475]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:56:12 compute-0 network[161492]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:56:12 compute-0 network[161493]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:56:12 compute-0 network[161494]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:56:16 compute-0 sudo[161473]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:17 compute-0 sudo[161764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmmkscwdrjveqntfptcdmsmicmbqnxqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096977.0583944-292-192994672168967/AnsiballZ_file.py'
Nov 25 18:56:17 compute-0 sudo[161764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:17 compute-0 python3.9[161766]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 18:56:17 compute-0 sudo[161764]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:18 compute-0 sudo[161916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxmvrkblufgemuegvyomoyhlbjlmvrwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096978.2894006-308-35189180397044/AnsiballZ_modprobe.py'
Nov 25 18:56:18 compute-0 sudo[161916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:19 compute-0 python3.9[161918]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 25 18:56:19 compute-0 sudo[161916]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:19 compute-0 sudo[162072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnnenxkwpxnfelapztmauvhfbkosyzzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096979.304926-324-188337156671629/AnsiballZ_stat.py'
Nov 25 18:56:19 compute-0 sudo[162072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:19 compute-0 python3.9[162074]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:19 compute-0 sudo[162072]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:20 compute-0 sudo[162195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojfkhgojdixnvjirdryttdqoaxsksqpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096979.304926-324-188337156671629/AnsiballZ_copy.py'
Nov 25 18:56:20 compute-0 sudo[162195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:20 compute-0 python3.9[162197]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096979.304926-324-188337156671629/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:20 compute-0 sudo[162195]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:21 compute-0 sudo[162347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksnhfullypwwauwqidkerhswdesaftfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096980.9220216-356-39565311082467/AnsiballZ_lineinfile.py'
Nov 25 18:56:21 compute-0 sudo[162347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:21 compute-0 python3.9[162349]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:21 compute-0 sudo[162347]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:22 compute-0 sudo[162499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqdbofbnhooxssgyaojijvjqvsiywthy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096981.7572715-372-181498421875236/AnsiballZ_systemd.py'
Nov 25 18:56:22 compute-0 sudo[162499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:22 compute-0 python3.9[162501]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:56:22 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 25 18:56:22 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 25 18:56:22 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 25 18:56:22 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 25 18:56:22 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 25 18:56:22 compute-0 sudo[162499]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:23 compute-0 sudo[162655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxisvlnoyzxrjqowltifcwzxjzgnvddx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096983.0947654-388-135551220076472/AnsiballZ_file.py'
Nov 25 18:56:23 compute-0 sudo[162655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:23 compute-0 python3.9[162657]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:23 compute-0 sudo[162655]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:24 compute-0 sudo[162807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmaqncujftaqrlpnmzuxamefrfsfsuoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096984.1828449-406-205158460897072/AnsiballZ_stat.py'
Nov 25 18:56:24 compute-0 sudo[162807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:24 compute-0 python3.9[162809]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:56:24 compute-0 sudo[162807]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:25 compute-0 sudo[162959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psubqsccldrfefarxwdojypyusnqmsft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096985.070847-424-204856255773357/AnsiballZ_stat.py'
Nov 25 18:56:25 compute-0 sudo[162959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:25 compute-0 python3.9[162961]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:56:25 compute-0 sudo[162959]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:26 compute-0 sudo[163111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlvvukgwpcegfqyymizihptrcpqeugza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096985.8738666-440-180041804373630/AnsiballZ_stat.py'
Nov 25 18:56:26 compute-0 sudo[163111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:26 compute-0 python3.9[163113]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:26 compute-0 sudo[163111]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:26 compute-0 sudo[163234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noreqenaqgeskxgsyauglatjgnfxdcpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096985.8738666-440-180041804373630/AnsiballZ_copy.py'
Nov 25 18:56:26 compute-0 sudo[163234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:27 compute-0 python3.9[163236]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764096985.8738666-440-180041804373630/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:27 compute-0 sudo[163234]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:27 compute-0 sudo[163386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjglrdamjwyctdnooxlqzgvctjstnngz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096987.4037638-470-191283742841921/AnsiballZ_command.py'
Nov 25 18:56:27 compute-0 sudo[163386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:28 compute-0 python3.9[163388]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:56:28 compute-0 sudo[163386]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:28 compute-0 sudo[163539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-napxzmoskjiucfipcepppxpymugptpft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096988.2404761-486-175381109903944/AnsiballZ_lineinfile.py'
Nov 25 18:56:28 compute-0 sudo[163539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:28 compute-0 python3.9[163541]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:28 compute-0 sudo[163539]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:29 compute-0 sudo[163691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nklpiwjfowkcvtszjckonblsdefstjvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096989.1066196-502-104199448768537/AnsiballZ_replace.py'
Nov 25 18:56:29 compute-0 sudo[163691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:29 compute-0 python3.9[163693]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:29 compute-0 sudo[163691]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:30 compute-0 sudo[163843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bitqbqletuyvhwkkibjcuzkeswaljxkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096990.1720893-518-169103585727165/AnsiballZ_replace.py'
Nov 25 18:56:30 compute-0 sudo[163843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:30 compute-0 python3.9[163845]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:30 compute-0 sudo[163843]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:56:31.040 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:56:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:56:31.042 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:56:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:56:31.042 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:56:31 compute-0 sudo[163996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btchycmleuhiahyzngdvmigbreskvqyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096991.067155-536-96718147056603/AnsiballZ_lineinfile.py'
Nov 25 18:56:31 compute-0 sudo[163996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:31 compute-0 python3.9[163998]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:31 compute-0 sudo[163996]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:32 compute-0 sudo[164148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyjhslqmymcbupauayxccblinpquloyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096991.8408346-536-197041300053176/AnsiballZ_lineinfile.py'
Nov 25 18:56:32 compute-0 sudo[164148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:32 compute-0 python3.9[164150]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:32 compute-0 sudo[164148]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:32 compute-0 sudo[164312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ximpyieiehhwvsamiftsddowhvtqbudg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096992.5364459-536-128669394719717/AnsiballZ_lineinfile.py'
Nov 25 18:56:32 compute-0 sudo[164312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:33 compute-0 podman[164274]: 2025-11-25 18:56:33.066293034 +0000 UTC m=+0.192706961 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 18:56:33 compute-0 sshd-session[164325]: Connection closed by 159.65.25.171 port 47316
Nov 25 18:56:33 compute-0 python3.9[164320]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:33 compute-0 sudo[164312]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:33 compute-0 sudo[164479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqcrtsqnqjomtstptkguxckjzhofjmby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096993.376575-536-100878951055753/AnsiballZ_lineinfile.py'
Nov 25 18:56:33 compute-0 sudo[164479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:34 compute-0 python3.9[164481]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:34 compute-0 sudo[164479]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:34 compute-0 sudo[164631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysagyrrawtldifsgjgynfzfrzsfnyjft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096994.2755911-594-113917709822238/AnsiballZ_stat.py'
Nov 25 18:56:34 compute-0 sudo[164631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:34 compute-0 python3.9[164633]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:56:34 compute-0 sudo[164631]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:35 compute-0 sudo[164785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzmgmvgmstppfxdrfkhdiflpiwcuhulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096995.2716146-610-87277125171481/AnsiballZ_file.py'
Nov 25 18:56:35 compute-0 sudo[164785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:35 compute-0 python3.9[164787]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:35 compute-0 sudo[164785]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:36 compute-0 podman[164812]: 2025-11-25 18:56:36.188278636 +0000 UTC m=+0.101884981 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251125)
Nov 25 18:56:36 compute-0 sudo[164953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ettirfepgeatabwurvcrojjjowwotyze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096996.1585402-628-161263060161177/AnsiballZ_file.py'
Nov 25 18:56:36 compute-0 sudo[164953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:36 compute-0 python3.9[164955]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:36 compute-0 sudo[164953]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:37 compute-0 sudo[165105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycgjzkxpzrdkhtwlaglyawaluhnuukzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096997.0656483-644-77540406440940/AnsiballZ_stat.py'
Nov 25 18:56:37 compute-0 sudo[165105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:37 compute-0 python3.9[165107]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:37 compute-0 sudo[165105]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:38 compute-0 sudo[165183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bppvopnyeprhkpkrslpfqqsohnbpquwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096997.0656483-644-77540406440940/AnsiballZ_file.py'
Nov 25 18:56:38 compute-0 sudo[165183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:38 compute-0 python3.9[165185]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:38 compute-0 sudo[165183]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:38 compute-0 sudo[165335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stuavscloxrjistzuctvwsfrzkyohyfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096998.4473236-644-151075202996537/AnsiballZ_stat.py'
Nov 25 18:56:38 compute-0 sudo[165335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:39 compute-0 python3.9[165337]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:39 compute-0 sudo[165335]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:39 compute-0 sudo[165413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apppjuusbaksfyrjnctvgkosqjeitnzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096998.4473236-644-151075202996537/AnsiballZ_file.py'
Nov 25 18:56:39 compute-0 sudo[165413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:39 compute-0 python3.9[165415]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:39 compute-0 sudo[165413]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:40 compute-0 sudo[165565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdzpfcxjbmlizfoquujadybspjedvmeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764096999.9495895-690-234206375962519/AnsiballZ_file.py'
Nov 25 18:56:40 compute-0 sudo[165565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:40 compute-0 python3.9[165567]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:40 compute-0 sudo[165565]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:41 compute-0 sudo[165717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfcifnetspzqsonvcozvliwqctkmabyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097000.8166282-706-39017345669205/AnsiballZ_stat.py'
Nov 25 18:56:41 compute-0 sudo[165717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:41 compute-0 python3.9[165719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:41 compute-0 sudo[165717]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:41 compute-0 sudo[165795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmpovbjqrwyyyzupigpketozvrwyafhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097000.8166282-706-39017345669205/AnsiballZ_file.py'
Nov 25 18:56:41 compute-0 sudo[165795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:41 compute-0 python3.9[165797]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:41 compute-0 sudo[165795]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:42 compute-0 sudo[165947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juwgeeqyujgvrzqihdxfmzaavbtoxybd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097002.2729218-730-43939257381892/AnsiballZ_stat.py'
Nov 25 18:56:42 compute-0 sudo[165947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:42 compute-0 python3.9[165949]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:42 compute-0 sudo[165947]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:43 compute-0 sudo[166025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmpymegodhcwslcaqlhmisgobgqwyuoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097002.2729218-730-43939257381892/AnsiballZ_file.py'
Nov 25 18:56:43 compute-0 sudo[166025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:43 compute-0 python3.9[166027]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:43 compute-0 sudo[166025]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:44 compute-0 sudo[166177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rodxycsmvdfghcqsttcchoafyolinwpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097003.683136-754-158980778429136/AnsiballZ_systemd.py'
Nov 25 18:56:44 compute-0 sudo[166177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:44 compute-0 python3.9[166179]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:56:44 compute-0 systemd[1]: Reloading.
Nov 25 18:56:44 compute-0 systemd-rc-local-generator[166208]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:56:44 compute-0 systemd-sysv-generator[166212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:56:44 compute-0 sudo[166177]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:45 compute-0 sudo[166366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkfzlziiisoasmpetiagbynmlztuiyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097005.0726366-770-194430849056825/AnsiballZ_stat.py'
Nov 25 18:56:45 compute-0 sudo[166366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:45 compute-0 python3.9[166368]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:45 compute-0 sudo[166366]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:46 compute-0 sudo[166444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrlttuikxvgfbnekloekvykkemubskuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097005.0726366-770-194430849056825/AnsiballZ_file.py'
Nov 25 18:56:46 compute-0 sudo[166444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:46 compute-0 python3.9[166446]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:46 compute-0 sudo[166444]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:47 compute-0 sudo[166596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-katlczgesktdrpxvfdabaoahypthkdlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097006.6422706-794-36250709411012/AnsiballZ_stat.py'
Nov 25 18:56:47 compute-0 sudo[166596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:47 compute-0 python3.9[166598]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:47 compute-0 sudo[166596]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:47 compute-0 sudo[166674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afokqilfdvexlwqhctvjwswukuweunkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097006.6422706-794-36250709411012/AnsiballZ_file.py'
Nov 25 18:56:47 compute-0 sudo[166674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:47 compute-0 python3.9[166676]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:47 compute-0 sudo[166674]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:48 compute-0 sudo[166826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixadcxgimkkhdtpymnvleixnqshhyhbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097008.1455503-818-213003797474757/AnsiballZ_systemd.py'
Nov 25 18:56:48 compute-0 sudo[166826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:49 compute-0 python3.9[166828]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:56:49 compute-0 systemd[1]: Reloading.
Nov 25 18:56:49 compute-0 systemd-rc-local-generator[166857]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:56:49 compute-0 systemd-sysv-generator[166860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:56:49 compute-0 systemd[1]: Starting Create netns directory...
Nov 25 18:56:49 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 25 18:56:49 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 25 18:56:49 compute-0 systemd[1]: Finished Create netns directory.
Nov 25 18:56:49 compute-0 sudo[166826]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:50 compute-0 sudo[167022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpeuztuxymtydxjhayfmehjghrdbzqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097009.8475027-838-181606681830905/AnsiballZ_file.py'
Nov 25 18:56:50 compute-0 sudo[167022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:50 compute-0 python3.9[167024]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:50 compute-0 sudo[167022]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:51 compute-0 sudo[167174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsplktsuhxnsgirtyvmkubrzsaokdbcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097010.7160265-854-267779994907927/AnsiballZ_stat.py'
Nov 25 18:56:51 compute-0 sudo[167174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:51 compute-0 python3.9[167176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:51 compute-0 sudo[167174]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:51 compute-0 sudo[167297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydqkdhxuiohdeyklgnshdyamfwhkyiui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097010.7160265-854-267779994907927/AnsiballZ_copy.py'
Nov 25 18:56:51 compute-0 sudo[167297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:51 compute-0 python3.9[167299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097010.7160265-854-267779994907927/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:52 compute-0 sudo[167297]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:52 compute-0 sudo[167449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etuhomxagmfrfhlgspixryvgolxkdvtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097012.5842843-888-22616984899497/AnsiballZ_file.py'
Nov 25 18:56:52 compute-0 sudo[167449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:53 compute-0 python3.9[167451]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:56:53 compute-0 sudo[167449]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:53 compute-0 sudo[167601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orwmdcwgpilmdwjixelavpvvbhdacqhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097013.5083425-904-42888542135655/AnsiballZ_stat.py'
Nov 25 18:56:53 compute-0 sudo[167601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:54 compute-0 python3.9[167603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:56:54 compute-0 sudo[167601]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:54 compute-0 sudo[167724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odmkmfkxlzxgpzxcyflggifrhfoubhpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097013.5083425-904-42888542135655/AnsiballZ_copy.py'
Nov 25 18:56:54 compute-0 sudo[167724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:54 compute-0 python3.9[167726]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097013.5083425-904-42888542135655/.source.json _original_basename=.7w9bjnu1 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:54 compute-0 sudo[167724]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:55 compute-0 sudo[167876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfmkontcyeuylsdickipzzdrennitpen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097015.0736444-934-20244368784249/AnsiballZ_file.py'
Nov 25 18:56:55 compute-0 sudo[167876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:55 compute-0 python3.9[167878]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:56:55 compute-0 sudo[167876]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:56 compute-0 sudo[168028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obfmyxewpfcojgjqqoafvyjcbedsruun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097015.987932-950-272700147761668/AnsiballZ_stat.py'
Nov 25 18:56:56 compute-0 sudo[168028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:56 compute-0 sudo[168028]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:57 compute-0 sudo[168151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lualloiaseyifzknzcefuxwvgmowvyuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097015.987932-950-272700147761668/AnsiballZ_copy.py'
Nov 25 18:56:57 compute-0 sudo[168151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:57 compute-0 sudo[168151]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:58 compute-0 sudo[168303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffwokcziupmykvyrvikrrxrdovalgxfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097017.7716646-984-58110833964562/AnsiballZ_container_config_data.py'
Nov 25 18:56:58 compute-0 sudo[168303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:58 compute-0 python3.9[168305]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 25 18:56:58 compute-0 sudo[168303]: pam_unix(sudo:session): session closed for user root
Nov 25 18:56:59 compute-0 sudo[168455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifuuahfmielliavrhazwawnjejdqvwdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097018.9768965-1002-194897310004488/AnsiballZ_container_config_hash.py'
Nov 25 18:56:59 compute-0 sudo[168455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:56:59 compute-0 python3.9[168457]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 18:56:59 compute-0 sudo[168455]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:00 compute-0 sudo[168607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygixwtzdsbqutsusaluthzpszykurplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097020.217101-1020-158262797015222/AnsiballZ_podman_container_info.py'
Nov 25 18:57:00 compute-0 sudo[168607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:00 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 25 18:57:01 compute-0 python3.9[168609]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 25 18:57:01 compute-0 sudo[168607]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:02 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 18:57:02 compute-0 sudo[168788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmfwbzzwwzrfnipfbjzpobmpbzlrbmbb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764097022.095181-1046-40364921108369/AnsiballZ_edpm_container_manage.py'
Nov 25 18:57:02 compute-0 sudo[168788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:03 compute-0 python3[168790]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 18:57:03 compute-0 podman[168826]: 2025-11-25 18:57:03.230139324 +0000 UTC m=+0.026884070 image pull 3b9623fd19bd3aa77b0b5fd336125d3125adff84d7957abb18fcf4bd44d404d6 38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Nov 25 18:57:03 compute-0 podman[168826]: 2025-11-25 18:57:03.387591201 +0000 UTC m=+0.184335907 container create 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 25 18:57:03 compute-0 python3[168790]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Nov 25 18:57:03 compute-0 sudo[168788]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:03 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 25 18:57:03 compute-0 podman[168890]: 2025-11-25 18:57:03.964298001 +0000 UTC m=+0.158386863 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:57:04 compute-0 sudo[169040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecgtjugjadwlqhhkbtquordzkipagiwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097023.8239014-1062-140698088169664/AnsiballZ_stat.py'
Nov 25 18:57:04 compute-0 sudo[169040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:04 compute-0 python3.9[169042]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:57:04 compute-0 sudo[169040]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:05 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 25 18:57:05 compute-0 sudo[169195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okgjuopeclefyftvlzicsqhkpszspkqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097024.8871233-1080-276609495281437/AnsiballZ_file.py'
Nov 25 18:57:05 compute-0 sudo[169195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:05 compute-0 python3.9[169197]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:05 compute-0 sudo[169195]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:05 compute-0 sudo[169271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nksflxxhijcuyizkvpllxmgnfiriqqay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097024.8871233-1080-276609495281437/AnsiballZ_stat.py'
Nov 25 18:57:05 compute-0 sudo[169271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:06 compute-0 python3.9[169273]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:57:06 compute-0 sudo[169271]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:06 compute-0 podman[169396]: 2025-11-25 18:57:06.704342828 +0000 UTC m=+0.077631505 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 18:57:06 compute-0 sudo[169437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-airkpyihfzqkoqwcslwuyshfhsuzhvxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097026.1135406-1080-139500235453444/AnsiballZ_copy.py'
Nov 25 18:57:06 compute-0 sudo[169437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:06 compute-0 python3.9[169443]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764097026.1135406-1080-139500235453444/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:06 compute-0 sudo[169437]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:07 compute-0 sudo[169517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roeuwlhsjczklgkcfzumjboeaizajixp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097026.1135406-1080-139500235453444/AnsiballZ_systemd.py'
Nov 25 18:57:07 compute-0 sudo[169517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:07 compute-0 python3.9[169519]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:57:07 compute-0 systemd[1]: Reloading.
Nov 25 18:57:07 compute-0 systemd-rc-local-generator[169547]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:57:07 compute-0 systemd-sysv-generator[169550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:57:07 compute-0 sudo[169517]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:08 compute-0 sudo[169629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvrqejpzbeonhensbdvovogukrrecaos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097026.1135406-1080-139500235453444/AnsiballZ_systemd.py'
Nov 25 18:57:08 compute-0 sudo[169629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:08 compute-0 python3.9[169631]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:09 compute-0 systemd[1]: Reloading.
Nov 25 18:57:09 compute-0 systemd-rc-local-generator[169659]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:57:09 compute-0 systemd-sysv-generator[169663]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:57:10 compute-0 systemd[1]: Starting multipathd container...
Nov 25 18:57:10 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:57:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3942b02a50396e3d5899d88257e4d89eaa071f60203c956126ff4fa83230c0c7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 18:57:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3942b02a50396e3d5899d88257e4d89eaa071f60203c956126ff4fa83230c0c7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 18:57:10 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.
Nov 25 18:57:10 compute-0 podman[169670]: 2025-11-25 18:57:10.459999723 +0000 UTC m=+0.414767794 container init 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 18:57:10 compute-0 multipathd[169685]: + sudo -E kolla_set_configs
Nov 25 18:57:10 compute-0 podman[169670]: 2025-11-25 18:57:10.496734425 +0000 UTC m=+0.451502456 container start 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 25 18:57:10 compute-0 sudo[169691]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 25 18:57:10 compute-0 sudo[169691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 18:57:10 compute-0 podman[169670]: multipathd
Nov 25 18:57:10 compute-0 systemd[1]: Started multipathd container.
Nov 25 18:57:10 compute-0 multipathd[169685]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 18:57:10 compute-0 multipathd[169685]: INFO:__main__:Validating config file
Nov 25 18:57:10 compute-0 multipathd[169685]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 18:57:10 compute-0 multipathd[169685]: INFO:__main__:Writing out command to execute
Nov 25 18:57:10 compute-0 sudo[169691]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:10 compute-0 multipathd[169685]: ++ cat /run_command
Nov 25 18:57:10 compute-0 multipathd[169685]: + CMD='/usr/sbin/multipathd -d'
Nov 25 18:57:10 compute-0 multipathd[169685]: + ARGS=
Nov 25 18:57:10 compute-0 multipathd[169685]: + sudo kolla_copy_cacerts
Nov 25 18:57:10 compute-0 sudo[169629]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:10 compute-0 sudo[169713]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 25 18:57:10 compute-0 sudo[169713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 18:57:10 compute-0 podman[169692]: 2025-11-25 18:57:10.615759265 +0000 UTC m=+0.100136917 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 25 18:57:10 compute-0 sudo[169713]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:10 compute-0 multipathd[169685]: Running command: '/usr/sbin/multipathd -d'
Nov 25 18:57:10 compute-0 multipathd[169685]: + [[ ! -n '' ]]
Nov 25 18:57:10 compute-0 multipathd[169685]: + . kolla_extend_start
Nov 25 18:57:10 compute-0 multipathd[169685]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 25 18:57:10 compute-0 multipathd[169685]: + umask 0022
Nov 25 18:57:10 compute-0 multipathd[169685]: + exec /usr/sbin/multipathd -d
Nov 25 18:57:10 compute-0 systemd[1]: 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562-1795709f89e35254.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 18:57:10 compute-0 systemd[1]: 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562-1795709f89e35254.service: Failed with result 'exit-code'.
Nov 25 18:57:10 compute-0 multipathd[169685]: 2978.358729 | multipathd v0.9.9: start up
Nov 25 18:57:10 compute-0 multipathd[169685]: 2978.368768 | reconfigure: setting up paths and maps
Nov 25 18:57:10 compute-0 multipathd[169685]: 2978.399096 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Nov 25 18:57:10 compute-0 multipathd[169685]: 2978.404976 | updated bindings file /etc/multipath/bindings
Nov 25 18:57:11 compute-0 python3.9[169874]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:57:12 compute-0 sudo[170026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbozzcmqywwndyyrfgekylgsnyxmnzmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097031.673623-1152-178486649045084/AnsiballZ_command.py'
Nov 25 18:57:12 compute-0 sudo[170026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:12 compute-0 python3.9[170028]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:57:12 compute-0 sudo[170026]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:13 compute-0 sudo[170191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyrzbgfagzrsumaccnjujpplthwktfzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097032.6209843-1168-275063245329768/AnsiballZ_systemd.py'
Nov 25 18:57:13 compute-0 sudo[170191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:13 compute-0 python3.9[170193]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:57:13 compute-0 systemd[1]: Stopping multipathd container...
Nov 25 18:57:13 compute-0 multipathd[169685]: 2981.190719 | multipathd: shut down
Nov 25 18:57:13 compute-0 systemd[1]: libpod-1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.scope: Deactivated successfully.
Nov 25 18:57:13 compute-0 podman[170197]: 2025-11-25 18:57:13.502098691 +0000 UTC m=+0.093205611 container died 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Nov 25 18:57:13 compute-0 systemd[1]: 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562-1795709f89e35254.timer: Deactivated successfully.
Nov 25 18:57:13 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.
Nov 25 18:57:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562-userdata-shm.mount: Deactivated successfully.
Nov 25 18:57:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-3942b02a50396e3d5899d88257e4d89eaa071f60203c956126ff4fa83230c0c7-merged.mount: Deactivated successfully.
Nov 25 18:57:13 compute-0 podman[170197]: 2025-11-25 18:57:13.57055361 +0000 UTC m=+0.161660530 container cleanup 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 18:57:13 compute-0 podman[170197]: multipathd
Nov 25 18:57:13 compute-0 podman[170226]: multipathd
Nov 25 18:57:13 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 25 18:57:13 compute-0 systemd[1]: Stopped multipathd container.
Nov 25 18:57:13 compute-0 systemd[1]: Starting multipathd container...
Nov 25 18:57:13 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:57:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3942b02a50396e3d5899d88257e4d89eaa071f60203c956126ff4fa83230c0c7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 18:57:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3942b02a50396e3d5899d88257e4d89eaa071f60203c956126ff4fa83230c0c7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 18:57:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.
Nov 25 18:57:13 compute-0 podman[170239]: 2025-11-25 18:57:13.833853926 +0000 UTC m=+0.136798456 container init 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Nov 25 18:57:13 compute-0 multipathd[170254]: + sudo -E kolla_set_configs
Nov 25 18:57:13 compute-0 podman[170239]: 2025-11-25 18:57:13.867804453 +0000 UTC m=+0.170748903 container start 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Nov 25 18:57:13 compute-0 sudo[170260]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 25 18:57:13 compute-0 podman[170239]: multipathd
Nov 25 18:57:13 compute-0 sudo[170260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 18:57:13 compute-0 systemd[1]: Started multipathd container.
Nov 25 18:57:13 compute-0 sudo[170191]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:13 compute-0 multipathd[170254]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 18:57:13 compute-0 multipathd[170254]: INFO:__main__:Validating config file
Nov 25 18:57:13 compute-0 multipathd[170254]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 18:57:13 compute-0 multipathd[170254]: INFO:__main__:Writing out command to execute
Nov 25 18:57:13 compute-0 sudo[170260]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:13 compute-0 multipathd[170254]: ++ cat /run_command
Nov 25 18:57:13 compute-0 multipathd[170254]: + CMD='/usr/sbin/multipathd -d'
Nov 25 18:57:13 compute-0 multipathd[170254]: + ARGS=
Nov 25 18:57:13 compute-0 multipathd[170254]: + sudo kolla_copy_cacerts
Nov 25 18:57:13 compute-0 sudo[170283]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 25 18:57:13 compute-0 sudo[170283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 25 18:57:13 compute-0 sudo[170283]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:13 compute-0 multipathd[170254]: + [[ ! -n '' ]]
Nov 25 18:57:13 compute-0 multipathd[170254]: + . kolla_extend_start
Nov 25 18:57:13 compute-0 multipathd[170254]: Running command: '/usr/sbin/multipathd -d'
Nov 25 18:57:13 compute-0 multipathd[170254]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 25 18:57:13 compute-0 multipathd[170254]: + umask 0022
Nov 25 18:57:13 compute-0 multipathd[170254]: + exec /usr/sbin/multipathd -d
Nov 25 18:57:13 compute-0 podman[170261]: 2025-11-25 18:57:13.993284486 +0000 UTC m=+0.107728799 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 18:57:14 compute-0 systemd[1]: 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562-1bf9fdfac6e7ab4f.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 18:57:14 compute-0 systemd[1]: 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562-1bf9fdfac6e7ab4f.service: Failed with result 'exit-code'.
Nov 25 18:57:14 compute-0 multipathd[170254]: 2981.731163 | multipathd v0.9.9: start up
Nov 25 18:57:14 compute-0 multipathd[170254]: 2981.742330 | reconfigure: setting up paths and maps
Nov 25 18:57:14 compute-0 sudo[170443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohezeptxnlabpldmneomzgydcojhfas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097034.1977072-1184-86570504869830/AnsiballZ_file.py'
Nov 25 18:57:14 compute-0 sudo[170443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:14 compute-0 python3.9[170445]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:14 compute-0 sudo[170443]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:15 compute-0 sudo[170595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eydnlvyjyisnvpuaasyakxvkjpqzqilc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097035.3426793-1208-52851238694023/AnsiballZ_file.py'
Nov 25 18:57:15 compute-0 sudo[170595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:15 compute-0 python3.9[170597]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 25 18:57:15 compute-0 sudo[170595]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:16 compute-0 sudo[170747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noqypydsgtfhevptoplbqacbqqtzycex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097036.1759381-1224-246070667410094/AnsiballZ_modprobe.py'
Nov 25 18:57:16 compute-0 sudo[170747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:16 compute-0 python3.9[170749]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 25 18:57:16 compute-0 kernel: Key type psk registered
Nov 25 18:57:16 compute-0 sudo[170747]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:17 compute-0 sudo[170908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hixzzltzivglkjamhtqakrgkajsyfpxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097037.158269-1240-192153011606457/AnsiballZ_stat.py'
Nov 25 18:57:17 compute-0 sudo[170908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:17 compute-0 python3.9[170910]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:57:17 compute-0 sudo[170908]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:18 compute-0 sudo[171031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnsxhaqvsdqcyelcgdmovoeuxogrejbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097037.158269-1240-192153011606457/AnsiballZ_copy.py'
Nov 25 18:57:18 compute-0 sudo[171031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:18 compute-0 python3.9[171033]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097037.158269-1240-192153011606457/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:18 compute-0 sudo[171031]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:20 compute-0 sudo[171183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdmfcvafuecyquyhjodjcgfsxkifyrpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097040.5200455-1272-120711436922576/AnsiballZ_lineinfile.py'
Nov 25 18:57:20 compute-0 sudo[171183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:21 compute-0 python3.9[171185]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:21 compute-0 sudo[171183]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:21 compute-0 sudo[171335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvdpfworylbhqtuucapwcwqjoqykozna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097041.356901-1288-134398375194325/AnsiballZ_systemd.py'
Nov 25 18:57:21 compute-0 sudo[171335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:22 compute-0 python3.9[171337]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:57:22 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 25 18:57:22 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 25 18:57:22 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 25 18:57:22 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 25 18:57:22 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 25 18:57:22 compute-0 sudo[171335]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:22 compute-0 sudo[171491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpvfqpmohcomipqqphabgqedyekivgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097042.5599139-1304-164259734549455/AnsiballZ_dnf.py'
Nov 25 18:57:22 compute-0 sudo[171491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:23 compute-0 python3.9[171493]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 25 18:57:25 compute-0 systemd[1]: Reloading.
Nov 25 18:57:25 compute-0 systemd-sysv-generator[171532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:57:25 compute-0 systemd-rc-local-generator[171528]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:57:26 compute-0 systemd[1]: Reloading.
Nov 25 18:57:26 compute-0 systemd-sysv-generator[171565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:57:26 compute-0 systemd-rc-local-generator[171561]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:57:26 compute-0 systemd-logind[820]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 25 18:57:26 compute-0 systemd-logind[820]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 25 18:57:26 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 25 18:57:26 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 25 18:57:26 compute-0 systemd[1]: Reloading.
Nov 25 18:57:26 compute-0 systemd-rc-local-generator[171657]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:57:26 compute-0 systemd-sysv-generator[171662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:57:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 25 18:57:27 compute-0 sudo[171491]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:28 compute-0 sudo[172726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwcumrwqzgqjkvszebpdyyukhsehnumn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097047.788468-1320-145101144775496/AnsiballZ_systemd_service.py'
Nov 25 18:57:28 compute-0 sudo[172726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:28 compute-0 python3.9[172752]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:57:28 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 25 18:57:28 compute-0 iscsid[161315]: iscsid shutting down.
Nov 25 18:57:28 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 25 18:57:28 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 25 18:57:28 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 25 18:57:28 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 25 18:57:28 compute-0 systemd[1]: Started Open-iSCSI.
Nov 25 18:57:28 compute-0 sudo[172726]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 25 18:57:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 25 18:57:28 compute-0 systemd[1]: man-db-cache-update.service: Consumed 2.190s CPU time.
Nov 25 18:57:28 compute-0 systemd[1]: run-rc1698e8cdce44af9b1e6e4c631a29bc3.service: Deactivated successfully.
Nov 25 18:57:29 compute-0 python3.9[173103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:57:30 compute-0 sudo[173257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchjvpzehtlucbxpnfpivnaevgbpvrjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097050.220801-1355-33990476953400/AnsiballZ_file.py'
Nov 25 18:57:30 compute-0 sudo[173257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:30 compute-0 python3.9[173259]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:30 compute-0 sudo[173257]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:57:31.044 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:57:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:57:31.046 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:57:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:57:31.046 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:57:31 compute-0 sudo[173410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgodmrjuahtxdynbnksxfexdhmlwdeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097051.286262-1377-201928828813281/AnsiballZ_systemd_service.py'
Nov 25 18:57:31 compute-0 sudo[173410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:32 compute-0 python3.9[173412]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:57:32 compute-0 systemd[1]: Reloading.
Nov 25 18:57:32 compute-0 systemd-rc-local-generator[173440]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:57:32 compute-0 systemd-sysv-generator[173443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:57:32 compute-0 sudo[173410]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:33 compute-0 python3.9[173597]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:57:33 compute-0 network[173614]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:57:33 compute-0 network[173615]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:57:33 compute-0 network[173616]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:57:34 compute-0 podman[173622]: 2025-11-25 18:57:34.319240365 +0000 UTC m=+0.143911778 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 18:57:36 compute-0 podman[173743]: 2025-11-25 18:57:36.844867538 +0000 UTC m=+0.086619304 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 18:57:39 compute-0 sudo[173931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nadwrftojervsdudfkeuszdqsmndyieh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097059.218039-1415-134442421349510/AnsiballZ_systemd_service.py'
Nov 25 18:57:39 compute-0 sudo[173931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:39 compute-0 python3.9[173933]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:40 compute-0 sudo[173931]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:40 compute-0 sudo[174084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlsfwkbkzyvrbujfxyhyrobwchgifypj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097060.2153366-1415-154008060055196/AnsiballZ_systemd_service.py'
Nov 25 18:57:40 compute-0 sudo[174084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:40 compute-0 python3.9[174086]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:41 compute-0 sudo[174084]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:41 compute-0 sudo[174237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtoeooazsoyobdzympdnmlmbdykdeehp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097061.1924624-1415-64687687386762/AnsiballZ_systemd_service.py'
Nov 25 18:57:41 compute-0 sudo[174237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:41 compute-0 python3.9[174239]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:41 compute-0 sudo[174237]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:42 compute-0 sudo[174390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvtmfebzgojstrdaitnmczmxdpjlukoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097062.1652505-1415-133509092889677/AnsiballZ_systemd_service.py'
Nov 25 18:57:42 compute-0 sudo[174390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:42 compute-0 python3.9[174392]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:42 compute-0 sudo[174390]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:43 compute-0 sudo[174543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtwclspyhajyzlfcavzgbetundusxjwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097063.1610472-1415-158414894681224/AnsiballZ_systemd_service.py'
Nov 25 18:57:43 compute-0 sudo[174543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:43 compute-0 python3.9[174545]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:43 compute-0 sudo[174543]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:44 compute-0 podman[174594]: 2025-11-25 18:57:44.157590353 +0000 UTC m=+0.079817332 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 18:57:44 compute-0 sudo[174718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dltvzaifvivccdbeglqijwmphonztive ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097064.0608175-1415-225656275293984/AnsiballZ_systemd_service.py'
Nov 25 18:57:44 compute-0 sudo[174718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:44 compute-0 python3.9[174720]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:44 compute-0 sudo[174718]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:45 compute-0 sudo[174871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcdecwbjtqpybocmwbweiaoxehutwpij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097065.009213-1415-176704806523618/AnsiballZ_systemd_service.py'
Nov 25 18:57:45 compute-0 sudo[174871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:45 compute-0 python3.9[174873]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:45 compute-0 sudo[174871]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:46 compute-0 sudo[175024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnjfgmyowbscmlbkfbxoijscyiceltkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097065.9195373-1415-239903134875939/AnsiballZ_systemd_service.py'
Nov 25 18:57:46 compute-0 sudo[175024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:46 compute-0 python3.9[175026]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:57:46 compute-0 sudo[175024]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:47 compute-0 sudo[175177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nepazezunussfwatlewktactatkvmgmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097067.19229-1533-21027357448247/AnsiballZ_file.py'
Nov 25 18:57:47 compute-0 sudo[175177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:47 compute-0 python3.9[175179]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:47 compute-0 sudo[175177]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:48 compute-0 sudo[175329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfnrtjtkxpxsesflhusirbwoigcqvta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097068.0216632-1533-157707513588467/AnsiballZ_file.py'
Nov 25 18:57:48 compute-0 sudo[175329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:48 compute-0 python3.9[175331]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:48 compute-0 sudo[175329]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:49 compute-0 sudo[175481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvsmfokrtqltdymkgivxjalfnfktsxut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097068.8050528-1533-43002502019073/AnsiballZ_file.py'
Nov 25 18:57:49 compute-0 sudo[175481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:49 compute-0 python3.9[175483]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:49 compute-0 sudo[175481]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:49 compute-0 sudo[175633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biqlkspadagezukrtiffcpwekogxpvpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097069.4853337-1533-184987300269771/AnsiballZ_file.py'
Nov 25 18:57:49 compute-0 sudo[175633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:50 compute-0 python3.9[175635]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:50 compute-0 sudo[175633]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:50 compute-0 sudo[175785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfuhttrfmkwjzuamxeghqhuinvmbswlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097070.2751534-1533-241397138119159/AnsiballZ_file.py'
Nov 25 18:57:50 compute-0 sudo[175785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:50 compute-0 python3.9[175787]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:50 compute-0 sudo[175785]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:51 compute-0 sudo[175937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytvjqfyliguciucgefcsddujcjrisurx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097071.071857-1533-271075173876114/AnsiballZ_file.py'
Nov 25 18:57:51 compute-0 sudo[175937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:51 compute-0 python3.9[175939]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:51 compute-0 sudo[175937]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:52 compute-0 sudo[176089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njahcvxbfddxjciisrlmrwywczbaibzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097071.9008553-1533-4901798890953/AnsiballZ_file.py'
Nov 25 18:57:52 compute-0 sudo[176089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:52 compute-0 python3.9[176091]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:52 compute-0 sudo[176089]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:53 compute-0 sudo[176241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbymsgucurhnehtfxtlilcgrsgwuwovr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097072.6928763-1533-157448793323928/AnsiballZ_file.py'
Nov 25 18:57:53 compute-0 sudo[176241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:53 compute-0 python3.9[176243]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:53 compute-0 sudo[176241]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:53 compute-0 sudo[176393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdrfvfyolzsffpgfoqttkkgyjsbnnhze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097073.4856586-1647-41166209365165/AnsiballZ_file.py'
Nov 25 18:57:53 compute-0 sudo[176393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:54 compute-0 python3.9[176395]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:54 compute-0 sudo[176393]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:54 compute-0 sudo[176545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewwrnqsqqtlddwligyrxlddbtfsnketc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097074.2529376-1647-264194901772452/AnsiballZ_file.py'
Nov 25 18:57:54 compute-0 sudo[176545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:54 compute-0 python3.9[176547]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:54 compute-0 sudo[176545]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:55 compute-0 sudo[176697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvjqrregswpopwjqzivmquuhgibpbdtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097075.0352974-1647-26231584320455/AnsiballZ_file.py'
Nov 25 18:57:55 compute-0 sudo[176697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:55 compute-0 python3.9[176699]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:55 compute-0 sudo[176697]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:56 compute-0 sudo[176849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvivgbmveglmidjdtawyaiaxkuksxdrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097075.8429134-1647-159463364859655/AnsiballZ_file.py'
Nov 25 18:57:56 compute-0 sudo[176849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:56 compute-0 python3.9[176851]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:56 compute-0 sudo[176849]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:57 compute-0 sudo[177001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbemorirjgooyjthsjxlhedhxmsqeduk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097076.6298296-1647-214150446771348/AnsiballZ_file.py'
Nov 25 18:57:57 compute-0 sudo[177001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:57 compute-0 python3.9[177003]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:57 compute-0 sudo[177001]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:57 compute-0 sudo[177153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uchhoscigeojldjzgezxdrqqhpytsnqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097077.469996-1647-151828811137438/AnsiballZ_file.py'
Nov 25 18:57:57 compute-0 sudo[177153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:58 compute-0 python3.9[177155]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:58 compute-0 sudo[177153]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:58 compute-0 sudo[177305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjanqxgnqvszilwgsiouguilamdriqcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097078.2716997-1647-101890804425634/AnsiballZ_file.py'
Nov 25 18:57:58 compute-0 sudo[177305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:58 compute-0 python3.9[177307]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:58 compute-0 sudo[177305]: pam_unix(sudo:session): session closed for user root
Nov 25 18:57:59 compute-0 sudo[177457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkiigtywkapxsqdaoneorasfgqkfcduj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097079.0160024-1647-203048705352702/AnsiballZ_file.py'
Nov 25 18:57:59 compute-0 sudo[177457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:57:59 compute-0 python3.9[177459]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:57:59 compute-0 sudo[177457]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:00 compute-0 sudo[177609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbjzmkawodvbzqifnhywtolkqpfvbsyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097079.8718338-1763-189319497336880/AnsiballZ_command.py'
Nov 25 18:58:00 compute-0 sudo[177609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:00 compute-0 python3.9[177611]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:00 compute-0 sudo[177609]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:01 compute-0 python3.9[177763]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 18:58:02 compute-0 sudo[177913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oziuxfdgbpqracluqadfybuutwwoqsyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097081.8391054-1799-55424827982151/AnsiballZ_systemd_service.py'
Nov 25 18:58:02 compute-0 sudo[177913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:02 compute-0 python3.9[177915]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:58:02 compute-0 systemd[1]: Reloading.
Nov 25 18:58:02 compute-0 systemd-rc-local-generator[177937]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:58:02 compute-0 systemd-sysv-generator[177944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:58:02 compute-0 sudo[177913]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:03 compute-0 sudo[178099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haqvbaziuyqhdzvgamgbotzpieuamoqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097083.1461303-1815-251874514236726/AnsiballZ_command.py'
Nov 25 18:58:03 compute-0 sudo[178099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:03 compute-0 python3.9[178101]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:03 compute-0 sudo[178099]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:04 compute-0 sudo[178252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjdyxtuqsvyixpoxgyoxnxiwhdqseoaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097083.9390538-1815-279061840135882/AnsiballZ_command.py'
Nov 25 18:58:04 compute-0 sudo[178252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:04 compute-0 python3.9[178254]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:04 compute-0 sudo[178252]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:04 compute-0 podman[178256]: 2025-11-25 18:58:04.717662897 +0000 UTC m=+0.129642324 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 18:58:05 compute-0 sudo[178431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjyvqxrgrablannpwmonxheriamidagm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097084.8681242-1815-250539892373848/AnsiballZ_command.py'
Nov 25 18:58:05 compute-0 sudo[178431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:05 compute-0 python3.9[178433]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:05 compute-0 sudo[178431]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:06 compute-0 sudo[178584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weblteuybbjjbvxdkhowpysjndbfaubv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097085.647961-1815-141430794128365/AnsiballZ_command.py'
Nov 25 18:58:06 compute-0 sudo[178584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:06 compute-0 python3.9[178586]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:06 compute-0 sudo[178584]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:06 compute-0 sudo[178737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgryvebctwmbqxvjgzeqccitfinksai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097086.5287924-1815-279585757644038/AnsiballZ_command.py'
Nov 25 18:58:06 compute-0 sudo[178737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:07 compute-0 podman[178739]: 2025-11-25 18:58:07.021926076 +0000 UTC m=+0.084841865 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 18:58:07 compute-0 python3.9[178740]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:07 compute-0 sudo[178737]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:07 compute-0 sudo[178909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtknquufqnnrzvqwkbvvftrwqwwyncnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097087.3966386-1815-29375771953941/AnsiballZ_command.py'
Nov 25 18:58:07 compute-0 sudo[178909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:07 compute-0 python3.9[178911]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:08 compute-0 sudo[178909]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:08 compute-0 sudo[179062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-almhwsyqgrotgudwdijajmlnejtuqskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097088.2049682-1815-141402089454651/AnsiballZ_command.py'
Nov 25 18:58:08 compute-0 sudo[179062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:08 compute-0 python3.9[179064]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:09 compute-0 sudo[179062]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:10 compute-0 sudo[179215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xapdgglqhipmfzoakdukawfxgzyzlaqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097090.0080092-1815-61528067429868/AnsiballZ_command.py'
Nov 25 18:58:10 compute-0 sudo[179215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:10 compute-0 python3.9[179217]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:58:10 compute-0 sudo[179215]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:12 compute-0 sudo[179368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcqpgulyshkxmqyjachefdrxisnvexeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097091.7662075-1958-210411429486825/AnsiballZ_file.py'
Nov 25 18:58:12 compute-0 sudo[179368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:12 compute-0 python3.9[179370]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:12 compute-0 sudo[179368]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:12 compute-0 sudo[179520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zheiaufyicyaofazgnnpxvjbfhthopts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097092.565835-1958-57274369816621/AnsiballZ_file.py'
Nov 25 18:58:12 compute-0 sudo[179520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:13 compute-0 python3.9[179522]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:13 compute-0 sudo[179520]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:13 compute-0 sudo[179672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gccevcqolsnsxomoqoalnmncbkewhrwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097093.3773239-1958-34321179286037/AnsiballZ_file.py'
Nov 25 18:58:13 compute-0 sudo[179672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:13 compute-0 python3.9[179674]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:13 compute-0 sudo[179672]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:14 compute-0 sudo[179839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exbpvvlmnmvuptgwikwmhzsarqomtkul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097094.177789-2002-230602048421565/AnsiballZ_file.py'
Nov 25 18:58:14 compute-0 sudo[179839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:14 compute-0 podman[179798]: 2025-11-25 18:58:14.591426399 +0000 UTC m=+0.092180625 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Nov 25 18:58:14 compute-0 python3.9[179846]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:14 compute-0 sudo[179839]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:15 compute-0 sudo[179997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkssrzaplfrjomkakwavwfiqqmsapclg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097095.0250204-2002-169639266492307/AnsiballZ_file.py'
Nov 25 18:58:15 compute-0 sudo[179997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:15 compute-0 python3.9[179999]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:15 compute-0 sudo[179997]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:16 compute-0 sudo[180149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdpmelucrirfbelldbjhrbyswthycbtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097095.8060653-2002-200692739571448/AnsiballZ_file.py'
Nov 25 18:58:16 compute-0 sudo[180149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:16 compute-0 python3.9[180151]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:16 compute-0 sudo[180149]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:17 compute-0 sudo[180301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efaxskxlocopmxpbuxduxhrcyarpcsgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097096.6455386-2002-226413079899138/AnsiballZ_file.py'
Nov 25 18:58:17 compute-0 sudo[180301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:17 compute-0 python3.9[180303]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:17 compute-0 sudo[180301]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:17 compute-0 sudo[180453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clhrnpvxkqneekbznwdapibgddvkwlut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097097.456956-2002-48330009946338/AnsiballZ_file.py'
Nov 25 18:58:17 compute-0 sudo[180453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:18 compute-0 python3.9[180455]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:18 compute-0 sudo[180453]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:18 compute-0 sudo[180605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydsvoqlqahalumywnhaxlirygikjunj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097098.3050783-2002-144654991846566/AnsiballZ_file.py'
Nov 25 18:58:18 compute-0 sudo[180605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:18 compute-0 python3.9[180607]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:18 compute-0 sudo[180605]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:19 compute-0 sudo[180757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boaeiqwyckmehswcciliuffscjhaajeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097099.1457586-2002-6591259929004/AnsiballZ_file.py'
Nov 25 18:58:19 compute-0 sudo[180757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:19 compute-0 python3.9[180759]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:19 compute-0 sudo[180757]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:24 compute-0 sudo[180909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgyutgsaetzgvbyowipdhupbybcnkbwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097104.4877043-2239-156910553655230/AnsiballZ_getent.py'
Nov 25 18:58:24 compute-0 sudo[180909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:25 compute-0 python3.9[180911]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 25 18:58:25 compute-0 sudo[180909]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:25 compute-0 sudo[181062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcbqkeoqgufvemfyudsuhizbqisbeen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097105.463597-2255-150019918889845/AnsiballZ_group.py'
Nov 25 18:58:25 compute-0 sudo[181062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:26 compute-0 python3.9[181064]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 18:58:26 compute-0 groupadd[181065]: group added to /etc/group: name=nova, GID=42436
Nov 25 18:58:26 compute-0 groupadd[181065]: group added to /etc/gshadow: name=nova
Nov 25 18:58:26 compute-0 groupadd[181065]: new group: name=nova, GID=42436
Nov 25 18:58:26 compute-0 sudo[181062]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:27 compute-0 sudo[181220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaybqjtmbtuyeizebqxiiynzwnsmlrqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097106.5096135-2271-106532135699429/AnsiballZ_user.py'
Nov 25 18:58:27 compute-0 sudo[181220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:27 compute-0 python3.9[181222]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 18:58:27 compute-0 useradd[181224]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 25 18:58:27 compute-0 useradd[181224]: add 'nova' to group 'libvirt'
Nov 25 18:58:27 compute-0 useradd[181224]: add 'nova' to shadow group 'libvirt'
Nov 25 18:58:27 compute-0 sudo[181220]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:28 compute-0 sshd-session[181255]: Accepted publickey for zuul from 192.168.122.30 port 53510 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:58:28 compute-0 systemd-logind[820]: New session 25 of user zuul.
Nov 25 18:58:28 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 25 18:58:28 compute-0 sshd-session[181255]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:58:28 compute-0 sshd-session[181258]: Received disconnect from 192.168.122.30 port 53510:11: disconnected by user
Nov 25 18:58:28 compute-0 sshd-session[181258]: Disconnected from user zuul 192.168.122.30 port 53510
Nov 25 18:58:28 compute-0 sshd-session[181255]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:58:28 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 25 18:58:28 compute-0 systemd-logind[820]: Session 25 logged out. Waiting for processes to exit.
Nov 25 18:58:28 compute-0 systemd-logind[820]: Removed session 25.
Nov 25 18:58:29 compute-0 python3.9[181408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:30 compute-0 python3.9[181529]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097108.9457073-2321-3445761833641/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:30 compute-0 python3.9[181679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:58:31.047 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:58:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:58:31.048 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:58:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:58:31.048 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:58:31 compute-0 python3.9[181756]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:32 compute-0 python3.9[181906]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:32 compute-0 python3.9[182027]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097111.5506487-2321-31806536805959/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:33 compute-0 python3.9[182177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:34 compute-0 python3.9[182298]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097113.1274126-2321-179615109704325/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:34 compute-0 podman[182422]: 2025-11-25 18:58:34.976969877 +0000 UTC m=+0.087088919 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:58:35 compute-0 python3.9[182459]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:35 compute-0 python3.9[182593]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097114.554283-2321-74553048153776/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:36 compute-0 python3.9[182743]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:37 compute-0 podman[182838]: 2025-11-25 18:58:37.150193232 +0000 UTC m=+0.075808854 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:58:37 compute-0 python3.9[182881]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097116.1118305-2321-201573946477024/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:38 compute-0 sudo[183033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcekglfusbhjniwvvwimpccysvyznqeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097117.7015407-2487-263870096317775/AnsiballZ_file.py'
Nov 25 18:58:38 compute-0 sudo[183033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:38 compute-0 python3.9[183035]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:58:38 compute-0 sudo[183033]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:38 compute-0 sudo[183185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlsclexbqcsnaitvzuzrjtnzfbnpzfmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097118.5412319-2503-121656544219671/AnsiballZ_copy.py'
Nov 25 18:58:38 compute-0 sudo[183185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:39 compute-0 python3.9[183187]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:58:39 compute-0 sudo[183185]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:39 compute-0 sudo[183337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyrbfbznejpeuyftdmgtisehzdvtetdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097119.368361-2519-84970102793946/AnsiballZ_stat.py'
Nov 25 18:58:39 compute-0 sudo[183337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:39 compute-0 python3.9[183339]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:58:39 compute-0 sudo[183337]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:40 compute-0 sudo[183489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezvmymnfmgkusdxlhcvkkkxrtvllefsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097120.1981483-2535-76081655647960/AnsiballZ_stat.py'
Nov 25 18:58:40 compute-0 sudo[183489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:40 compute-0 python3.9[183491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:40 compute-0 sudo[183489]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:41 compute-0 sudo[183612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sanmgxnbobaftgoedoixeztuynukkawm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097120.1981483-2535-76081655647960/AnsiballZ_copy.py'
Nov 25 18:58:41 compute-0 sudo[183612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:41 compute-0 python3.9[183614]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764097120.1981483-2535-76081655647960/.source _original_basename=.bo6g25sl follow=False checksum=f2e3b8542bbf221e582db5ea24e9fd0f16d2cd52 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 25 18:58:41 compute-0 sudo[183612]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:42 compute-0 python3.9[183766]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:58:43 compute-0 python3.9[183918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:43 compute-0 python3.9[184039]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097122.7222648-2587-121122705356818/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=34f351ae18ce84ec41eb75cc006d1bb7cc25cdcc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:44 compute-0 python3.9[184189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:58:45 compute-0 podman[184284]: 2025-11-25 18:58:45.131431952 +0000 UTC m=+0.081055027 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 18:58:45 compute-0 python3.9[184318]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097124.0885315-2617-68329491746258/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=663c66ce8d61547f15066ae52b23391a8d82b9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:58:46 compute-0 sudo[184480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwsmnjziflxazoqkbpsppvqqqgxztobo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097125.8786168-2651-113190748935040/AnsiballZ_container_config_data.py'
Nov 25 18:58:46 compute-0 sudo[184480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:46 compute-0 python3.9[184482]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 25 18:58:46 compute-0 sudo[184480]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:47 compute-0 sudo[184632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwftpsznqiwhesywdtcbngzfcqzizcel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097126.8355536-2669-153232075323491/AnsiballZ_container_config_hash.py'
Nov 25 18:58:47 compute-0 sudo[184632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:47 compute-0 python3.9[184634]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 18:58:47 compute-0 sudo[184632]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:48 compute-0 sudo[184784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-covcvgrjwgmvuwixjmodlbbljwsktqpr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764097127.951113-2689-169742427727178/AnsiballZ_edpm_container_manage.py'
Nov 25 18:58:48 compute-0 sudo[184784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:48 compute-0 python3[184786]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 18:58:48 compute-0 podman[184822]: 2025-11-25 18:58:48.869251506 +0000 UTC m=+0.076748160 container create cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Nov 25 18:58:48 compute-0 podman[184822]: 2025-11-25 18:58:48.828561939 +0000 UTC m=+0.036058633 image pull bbd9e65c99fb428dc4f8c73808d764a75903488c747752b60e55265221d7aeb4 38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Nov 25 18:58:48 compute-0 python3[184786]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 25 18:58:49 compute-0 sudo[184784]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:49 compute-0 sudo[185009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wofjiabtmozijdwfxoxtiemtibwfeopa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097129.3583612-2705-161560700356340/AnsiballZ_stat.py'
Nov 25 18:58:49 compute-0 sudo[185009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:49 compute-0 python3.9[185011]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:58:49 compute-0 sudo[185009]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:50 compute-0 sudo[185163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjukusjtgjyuhdcfzrbtzzdfasiviyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097130.5858393-2729-213299431297951/AnsiballZ_container_config_data.py'
Nov 25 18:58:50 compute-0 sudo[185163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:51 compute-0 python3.9[185165]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 25 18:58:51 compute-0 sudo[185163]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:51 compute-0 sudo[185315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihedsxglfrytuyxgkoyzgyekcdqggkev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097131.5018008-2747-50286982799103/AnsiballZ_container_config_hash.py'
Nov 25 18:58:51 compute-0 sudo[185315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:52 compute-0 python3.9[185317]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 18:58:52 compute-0 sudo[185315]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:53 compute-0 sudo[185467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umhqgknytcvlznffomdgrknjepukgrfh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764097132.6066635-2767-270450331778522/AnsiballZ_edpm_container_manage.py'
Nov 25 18:58:53 compute-0 sudo[185467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:53 compute-0 python3[185469]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 18:58:53 compute-0 podman[185507]: 2025-11-25 18:58:53.697488925 +0000 UTC m=+0.108639459 container create 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Nov 25 18:58:53 compute-0 podman[185507]: 2025-11-25 18:58:53.633669235 +0000 UTC m=+0.044819809 image pull bbd9e65c99fb428dc4f8c73808d764a75903488c747752b60e55265221d7aeb4 38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Nov 25 18:58:53 compute-0 python3[185469]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Nov 25 18:58:53 compute-0 sudo[185467]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:54 compute-0 sudo[185697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgjvsuohrsyhmpycpdswsudukflshhiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097134.125705-2783-24542404435616/AnsiballZ_stat.py'
Nov 25 18:58:54 compute-0 sudo[185697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:54 compute-0 python3.9[185699]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:58:54 compute-0 sudo[185697]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:55 compute-0 sudo[185851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wajayswszvpkhhiueyzotpdkwjymtyvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097135.106556-2801-122570869387596/AnsiballZ_file.py'
Nov 25 18:58:55 compute-0 sudo[185851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:55 compute-0 python3.9[185853]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:58:55 compute-0 sudo[185851]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:56 compute-0 sudo[186002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqxenvbwshlajvclmdmgumqvqwejrmcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097135.819638-2801-87398441851371/AnsiballZ_copy.py'
Nov 25 18:58:56 compute-0 sudo[186002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:56 compute-0 python3.9[186004]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764097135.819638-2801-87398441851371/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:58:56 compute-0 sudo[186002]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:56 compute-0 sudo[186078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anlydrwprgzmynesafjzxrzpmqmsvkjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097135.819638-2801-87398441851371/AnsiballZ_systemd.py'
Nov 25 18:58:56 compute-0 sudo[186078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:57 compute-0 python3.9[186080]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:58:57 compute-0 systemd[1]: Reloading.
Nov 25 18:58:57 compute-0 systemd-rc-local-generator[186108]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:58:57 compute-0 systemd-sysv-generator[186112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:58:57 compute-0 sudo[186078]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:57 compute-0 sudo[186189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdioylzacywnxgjtpmahhwhdigzceafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097135.819638-2801-87398441851371/AnsiballZ_systemd.py'
Nov 25 18:58:57 compute-0 sudo[186189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:58:58 compute-0 python3.9[186191]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:58:58 compute-0 systemd[1]: Reloading.
Nov 25 18:58:58 compute-0 systemd-rc-local-generator[186223]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:58:58 compute-0 systemd-sysv-generator[186228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:58:58 compute-0 systemd[1]: Starting nova_compute container...
Nov 25 18:58:58 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:58:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 25 18:58:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 18:58:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 18:58:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 18:58:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 25 18:58:58 compute-0 podman[186233]: 2025-11-25 18:58:58.967076264 +0000 UTC m=+0.246846185 container init 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 18:58:58 compute-0 podman[186233]: 2025-11-25 18:58:58.98136356 +0000 UTC m=+0.261133431 container start 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 18:58:58 compute-0 nova_compute[186249]: + sudo -E kolla_set_configs
Nov 25 18:58:59 compute-0 podman[186233]: nova_compute
Nov 25 18:58:59 compute-0 systemd[1]: Started nova_compute container.
Nov 25 18:58:59 compute-0 sudo[186189]: pam_unix(sudo:session): session closed for user root
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Validating config file
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying service configuration files
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Deleting /etc/ceph
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Creating directory /etc/ceph
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /etc/ceph
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Writing out command to execute
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:58:59 compute-0 nova_compute[186249]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 18:58:59 compute-0 nova_compute[186249]: ++ cat /run_command
Nov 25 18:58:59 compute-0 nova_compute[186249]: + CMD=nova-compute
Nov 25 18:58:59 compute-0 nova_compute[186249]: + ARGS=
Nov 25 18:58:59 compute-0 nova_compute[186249]: + sudo kolla_copy_cacerts
Nov 25 18:58:59 compute-0 nova_compute[186249]: + [[ ! -n '' ]]
Nov 25 18:58:59 compute-0 nova_compute[186249]: + . kolla_extend_start
Nov 25 18:58:59 compute-0 nova_compute[186249]: Running command: 'nova-compute'
Nov 25 18:58:59 compute-0 nova_compute[186249]: + echo 'Running command: '\''nova-compute'\'''
Nov 25 18:58:59 compute-0 nova_compute[186249]: + umask 0022
Nov 25 18:58:59 compute-0 nova_compute[186249]: + exec nova-compute
Nov 25 18:59:00 compute-0 python3.9[186410]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.035 186253 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.035 186253 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.035 186253 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.035 186253 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.216 186253 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.248 186253 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.249 186253 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.285 186253 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Nov 25 18:59:01 compute-0 nova_compute[186249]: 2025-11-25 18:59:01.287 186253 WARNING oslo_config.cfg [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Nov 25 18:59:01 compute-0 python3.9[186563]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.319 186253 INFO nova.virt.driver [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.439 186253 INFO nova.compute.provider_config [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 25 18:59:02 compute-0 python3.9[186713]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.947 186253 DEBUG oslo_concurrency.lockutils [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.948 186253 DEBUG oslo_concurrency.lockutils [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.949 186253 DEBUG oslo_concurrency.lockutils [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.950 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.950 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.951 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.951 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.952 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.952 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.953 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.953 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.953 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.954 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.954 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.955 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.955 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.955 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.956 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.956 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.957 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.957 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.957 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.958 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.958 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.958 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.958 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.959 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.959 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.959 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.959 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.960 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.960 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.960 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.960 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.961 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.961 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.961 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.961 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.961 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.962 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.962 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.962 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.962 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.963 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.963 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.963 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.963 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.964 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.964 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.964 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.964 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.965 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.965 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.965 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.965 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.966 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.966 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.966 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.966 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.966 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.967 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.967 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.967 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.967 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.967 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.968 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.968 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.968 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.968 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.969 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.969 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.969 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.969 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.969 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.970 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.970 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.970 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.970 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.971 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.971 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.971 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.971 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.971 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.972 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.972 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.972 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.972 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.973 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.973 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.973 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.973 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.974 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.974 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.974 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.974 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.975 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.975 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.975 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.975 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.975 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.976 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.976 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.976 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.976 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.976 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.977 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.977 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.977 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.977 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.978 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.978 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.978 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.978 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.978 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.979 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.979 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.979 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.979 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.980 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.980 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.980 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.980 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.980 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.981 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.981 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.981 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.981 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.981 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.982 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.982 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.982 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.982 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.983 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.983 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.983 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.983 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.983 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.984 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.984 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.984 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.984 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.985 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.985 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.985 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.985 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.985 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.986 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.986 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.986 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.986 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.987 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.987 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.987 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.987 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.988 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.988 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.988 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.988 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.989 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.989 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.989 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.989 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.989 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.990 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.990 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.990 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.990 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.991 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.991 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.991 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.991 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.992 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.992 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.992 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.992 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.993 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.993 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.993 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.993 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.994 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.994 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.994 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.994 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.994 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.995 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.995 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.995 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.995 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.996 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.996 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.996 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.996 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.996 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.997 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.997 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.997 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.997 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.998 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.998 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.998 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.998 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.998 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.999 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.999 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:02 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.999 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:02.999 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.000 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.000 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.000 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.000 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.001 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.001 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.001 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.001 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.001 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.002 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.002 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.002 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.002 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.002 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.002 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.003 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.003 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.003 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.003 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.003 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.003 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.004 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.004 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.004 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.004 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.004 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.004 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.005 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.005 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.005 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.005 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.005 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.005 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.006 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.006 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.006 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.006 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.006 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.006 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.007 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.007 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.007 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.007 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.007 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.007 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.008 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.009 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.009 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.009 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.010 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.010 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.010 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.010 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.010 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.010 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.011 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.011 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.011 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.011 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.011 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.011 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.012 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.013 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.013 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.013 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.013 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.013 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.013 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.014 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.015 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.016 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.017 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.018 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.019 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.020 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.021 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.022 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.023 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.024 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.025 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.026 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.027 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.028 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.029 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.030 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.031 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.032 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.033 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.034 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.035 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.036 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.037 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.038 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.039 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.040 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.041 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 WARNING oslo_config.cfg [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 25 18:59:03 compute-0 nova_compute[186249]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 25 18:59:03 compute-0 nova_compute[186249]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 25 18:59:03 compute-0 nova_compute[186249]: and ``live_migration_inbound_addr`` respectively.
Nov 25 18:59:03 compute-0 nova_compute[186249]: ).  Its value may be silently ignored in the future.
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.042 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.043 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.044 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.045 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.046 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.047 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.048 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.049 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.050 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.051 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.052 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.053 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.054 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.055 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.056 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.057 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.058 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.059 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.060 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.061 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.062 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.063 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.064 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.065 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.066 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.067 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.068 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.069 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.070 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.071 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.072 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.073 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.074 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.075 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.076 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.077 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.078 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.079 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.080 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.081 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.082 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.083 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.084 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.085 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.086 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.087 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.088 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.089 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.090 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.091 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.092 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.093 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.094 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.094 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.094 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.094 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.094 186253 DEBUG oslo_service.backend._eventlet.service [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.095 186253 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Nov 25 18:59:03 compute-0 sudo[186865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fckuspfsifuukvxbdblqvakxgchvlvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097142.8511698-2921-47096578167460/AnsiballZ_podman_container.py'
Nov 25 18:59:03 compute-0 sudo[186865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.601 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Nov 25 18:59:03 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 25 18:59:03 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.707 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fbe49d6ec00> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Nov 25 18:59:03 compute-0 nova_compute[186249]: libvirt:  error : internal error: could not initialize domain event timer
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.708 186253 WARNING nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.709 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fbe49d6ec00> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.712 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.714 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.714 186253 INFO nova.utils [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] The default thread pool MainProcess.default is initialized
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.715 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Nov 25 18:59:03 compute-0 nova_compute[186249]: 2025-11-25 18:59:03.716 186253 INFO nova.virt.libvirt.driver [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Connection event '1' reason 'None'
Nov 25 18:59:03 compute-0 python3.9[186867]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 25 18:59:03 compute-0 sudo[186865]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:03 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.222 186253 WARNING nova.virt.libvirt.driver [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.224 186253 DEBUG nova.virt.libvirt.volume.mount [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.639 186253 INFO nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Libvirt host capabilities <capabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]: 
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <host>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <uuid>67ab9541-c16e-406b-be77-292a72d03114</uuid>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <arch>x86_64</arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model>EPYC-Rome-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <vendor>AMD</vendor>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <microcode version='16777317'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <signature family='23' model='49' stepping='0'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='x2apic'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='tsc-deadline'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='osxsave'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='hypervisor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='tsc_adjust'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='spec-ctrl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='stibp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='arch-capabilities'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='cmp_legacy'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='topoext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='virt-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='lbrv'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='tsc-scale'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='vmcb-clean'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='pause-filter'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='pfthreshold'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='svme-addr-chk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='rdctl-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='skip-l1dfl-vmentry'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='mds-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature name='pschange-mc-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <pages unit='KiB' size='4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <pages unit='KiB' size='2048'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <pages unit='KiB' size='1048576'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <power_management>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <suspend_mem/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <suspend_disk/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <suspend_hybrid/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </power_management>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <iommu support='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <migration_features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <live/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <uri_transports>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <uri_transport>tcp</uri_transport>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <uri_transport>rdma</uri_transport>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </uri_transports>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </migration_features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <topology>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <cells num='1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <cell id='0'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           <memory unit='KiB'>7864316</memory>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           <pages unit='KiB' size='2048'>0</pages>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           <distances>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <sibling id='0' value='10'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           </distances>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           <cpus num='8'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:           </cpus>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         </cell>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </cells>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </topology>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <cache>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </cache>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <secmodel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model>selinux</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <doi>0</doi>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </secmodel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <secmodel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model>dac</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <doi>0</doi>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </secmodel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </host>
Nov 25 18:59:04 compute-0 nova_compute[186249]: 
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <guest>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <os_type>hvm</os_type>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <arch name='i686'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <wordsize>32</wordsize>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <domain type='qemu'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <domain type='kvm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <pae/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <nonpae/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <acpi default='on' toggle='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <apic default='on' toggle='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <cpuselection/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <deviceboot/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <disksnapshot default='on' toggle='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <externalSnapshot/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </guest>
Nov 25 18:59:04 compute-0 nova_compute[186249]: 
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <guest>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <os_type>hvm</os_type>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <arch name='x86_64'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <wordsize>64</wordsize>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <domain type='qemu'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <domain type='kvm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <acpi default='on' toggle='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <apic default='on' toggle='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <cpuselection/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <deviceboot/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <disksnapshot default='on' toggle='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <externalSnapshot/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </guest>
Nov 25 18:59:04 compute-0 nova_compute[186249]: 
Nov 25 18:59:04 compute-0 nova_compute[186249]: </capabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]: 
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.647 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Nov 25 18:59:04 compute-0 sudo[187098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoqkapctjrctgfgcioepcqbwehhklimb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097144.2567167-2937-220044988670713/AnsiballZ_systemd.py'
Nov 25 18:59:04 compute-0 sudo[187098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.679 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 25 18:59:04 compute-0 nova_compute[186249]: <domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <domain>kvm</domain>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <arch>i686</arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <vcpu max='240'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <iothreads supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <os supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='firmware'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <loader supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>rom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pflash</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='readonly'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>yes</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='secure'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </loader>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </os>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='maximumMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <vendor>AMD</vendor>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='succor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='custom' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-128'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-256'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-512'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <memoryBacking supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='sourceType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>anonymous</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>memfd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </memoryBacking>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <disk supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='diskDevice'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>disk</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cdrom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>floppy</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>lun</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ide</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>fdc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>sata</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </disk>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <graphics supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vnc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egl-headless</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </graphics>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <video supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='modelType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vga</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cirrus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>none</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>bochs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ramfb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </video>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hostdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='mode'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>subsystem</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='startupPolicy'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>mandatory</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>requisite</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>optional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='subsysType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pci</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='capsType'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='pciBackend'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hostdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <rng supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>random</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </rng>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <filesystem supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='driverType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>path</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>handle</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtiofs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </filesystem>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <tpm supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-tis</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-crb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emulator</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>external</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendVersion'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>2.0</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </tpm>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <redirdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </redirdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <channel supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </channel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <crypto supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </crypto>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <interface supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>passt</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </interface>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <panic supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>isa</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>hyperv</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </panic>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <console supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>null</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dev</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pipe</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stdio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>udp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tcp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu-vdagent</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </console>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <gic supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <genid supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backup supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <async-teardown supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <ps2 supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sev supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sgx supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hyperv supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='features'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>relaxed</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vapic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>spinlocks</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vpindex</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>runtime</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>synic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stimer</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reset</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vendor_id</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>frequencies</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reenlightenment</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tlbflush</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ipi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>avic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emsr_bitmap</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>xmm_input</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hyperv>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <launchSecurity supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='sectype'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tdx</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </launchSecurity>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </features>
Nov 25 18:59:04 compute-0 nova_compute[186249]: </domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.689 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 25 18:59:04 compute-0 nova_compute[186249]: <domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <domain>kvm</domain>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <arch>i686</arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <vcpu max='4096'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <iothreads supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <os supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='firmware'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <loader supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>rom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pflash</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='readonly'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>yes</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='secure'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </loader>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </os>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='maximumMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <vendor>AMD</vendor>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='succor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='custom' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-128'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-256'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-512'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <memoryBacking supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='sourceType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>anonymous</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>memfd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </memoryBacking>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <disk supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='diskDevice'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>disk</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cdrom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>floppy</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>lun</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>fdc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>sata</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </disk>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <graphics supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vnc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egl-headless</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </graphics>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <video supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='modelType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vga</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cirrus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>none</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>bochs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ramfb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </video>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hostdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='mode'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>subsystem</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='startupPolicy'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>mandatory</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>requisite</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>optional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='subsysType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pci</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='capsType'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='pciBackend'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hostdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <rng supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>random</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </rng>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <filesystem supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='driverType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>path</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>handle</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtiofs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </filesystem>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <tpm supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-tis</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-crb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emulator</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>external</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendVersion'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>2.0</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </tpm>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <redirdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </redirdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <channel supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </channel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <crypto supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </crypto>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <interface supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>passt</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </interface>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <panic supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>isa</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>hyperv</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </panic>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <console supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>null</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dev</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pipe</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stdio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>udp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tcp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu-vdagent</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </console>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <gic supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <genid supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backup supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <async-teardown supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <ps2 supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sev supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sgx supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hyperv supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='features'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>relaxed</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vapic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>spinlocks</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vpindex</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>runtime</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>synic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stimer</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reset</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vendor_id</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>frequencies</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reenlightenment</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tlbflush</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ipi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>avic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emsr_bitmap</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>xmm_input</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hyperv>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <launchSecurity supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='sectype'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tdx</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </launchSecurity>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </features>
Nov 25 18:59:04 compute-0 nova_compute[186249]: </domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.722 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.726 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 25 18:59:04 compute-0 nova_compute[186249]: <domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <domain>kvm</domain>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <arch>x86_64</arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <vcpu max='240'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <iothreads supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <os supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='firmware'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <loader supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>rom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pflash</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='readonly'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>yes</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='secure'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </loader>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </os>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='maximumMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <vendor>AMD</vendor>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='succor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='custom' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-128'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-256'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-512'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <memoryBacking supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='sourceType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>anonymous</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>memfd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </memoryBacking>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <disk supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='diskDevice'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>disk</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cdrom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>floppy</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>lun</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ide</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>fdc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>sata</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </disk>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <graphics supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vnc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egl-headless</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </graphics>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <video supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='modelType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vga</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cirrus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>none</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>bochs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ramfb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </video>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hostdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='mode'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>subsystem</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='startupPolicy'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>mandatory</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>requisite</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>optional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='subsysType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pci</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='capsType'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='pciBackend'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hostdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <rng supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>random</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </rng>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <filesystem supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='driverType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>path</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>handle</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtiofs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </filesystem>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <tpm supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-tis</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-crb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emulator</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>external</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendVersion'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>2.0</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </tpm>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <redirdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </redirdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <channel supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </channel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <crypto supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </crypto>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <interface supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>passt</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </interface>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <panic supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>isa</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>hyperv</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </panic>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <console supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>null</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dev</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pipe</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stdio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>udp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tcp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu-vdagent</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </console>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <gic supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <genid supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backup supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <async-teardown supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <ps2 supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sev supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sgx supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hyperv supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='features'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>relaxed</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vapic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>spinlocks</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vpindex</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>runtime</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>synic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stimer</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reset</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vendor_id</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>frequencies</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reenlightenment</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tlbflush</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ipi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>avic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emsr_bitmap</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>xmm_input</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hyperv>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <launchSecurity supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='sectype'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tdx</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </launchSecurity>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </features>
Nov 25 18:59:04 compute-0 nova_compute[186249]: </domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.790 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 25 18:59:04 compute-0 nova_compute[186249]: <domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <domain>kvm</domain>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <arch>x86_64</arch>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <vcpu max='4096'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <iothreads supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <os supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='firmware'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>efi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <loader supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>rom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pflash</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='readonly'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>yes</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='secure'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>yes</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>no</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </loader>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </os>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='maximumMigratable'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>on</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>off</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <vendor>AMD</vendor>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='succor'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <mode name='custom' supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Denverton-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='auto-ibrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amd-psfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='stibp-always-on'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='EPYC-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-128'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-256'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx10-512'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='prefetchiti'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Haswell-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512er'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512pf'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fma4'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tbm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xop'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='amx-tile'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-bf16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-fp16'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bitalg'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrc'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fzrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='la57'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='taa-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xfd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ifma'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cmpccxadd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fbsdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='fsrs'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ibrs-all'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mcdt-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pbrsb-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='psdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='serialize'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vaes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='hle'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='rtm'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512bw'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512cd'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512dq'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512f'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='avx512vl'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='invpcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pcid'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='pku'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='mpx'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='core-capability'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='split-lock-detect'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='cldemote'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='erms'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='gfni'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdir64b'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='movdiri'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='xsaves'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='athlon-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='core2duo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='coreduo-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='n270-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='ss'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <blockers model='phenom-v1'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnow'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <feature name='3dnowext'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </blockers>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </mode>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </cpu>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <memoryBacking supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <enum name='sourceType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>anonymous</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <value>memfd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </memoryBacking>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <disk supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='diskDevice'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>disk</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cdrom</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>floppy</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>lun</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>fdc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>sata</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </disk>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <graphics supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vnc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egl-headless</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </graphics>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <video supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='modelType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vga</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>cirrus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>none</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>bochs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ramfb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </video>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hostdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='mode'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>subsystem</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='startupPolicy'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>mandatory</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>requisite</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>optional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='subsysType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pci</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>scsi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='capsType'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='pciBackend'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hostdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <rng supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtio-non-transitional</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>random</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>egd</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </rng>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <filesystem supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='driverType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>path</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>handle</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>virtiofs</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </filesystem>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <tpm supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-tis</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tpm-crb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emulator</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>external</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendVersion'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>2.0</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </tpm>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <redirdev supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='bus'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>usb</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </redirdev>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <channel supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </channel>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <crypto supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendModel'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>builtin</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </crypto>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <interface supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='backendType'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>default</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>passt</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </interface>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <panic supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='model'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>isa</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>hyperv</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </panic>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <console supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='type'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>null</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vc</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pty</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dev</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>file</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>pipe</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stdio</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>udp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tcp</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>unix</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>qemu-vdagent</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>dbus</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </console>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </devices>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   <features>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <gic supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <genid supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <backup supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <async-teardown supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <ps2 supported='yes'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sev supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <sgx supported='no'/>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <hyperv supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='features'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>relaxed</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vapic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>spinlocks</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vpindex</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>runtime</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>synic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>stimer</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reset</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>vendor_id</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>frequencies</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>reenlightenment</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tlbflush</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>ipi</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>avic</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>emsr_bitmap</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>xmm_input</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </defaults>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </hyperv>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     <launchSecurity supported='yes'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       <enum name='sectype'>
Nov 25 18:59:04 compute-0 nova_compute[186249]:         <value>tdx</value>
Nov 25 18:59:04 compute-0 nova_compute[186249]:       </enum>
Nov 25 18:59:04 compute-0 nova_compute[186249]:     </launchSecurity>
Nov 25 18:59:04 compute-0 nova_compute[186249]:   </features>
Nov 25 18:59:04 compute-0 nova_compute[186249]: </domainCapabilities>
Nov 25 18:59:04 compute-0 nova_compute[186249]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.849 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.849 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.850 186253 DEBUG nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.850 186253 INFO nova.virt.libvirt.host [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Secure Boot support detected
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.856 186253 INFO nova.virt.libvirt.driver [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 25 18:59:04 compute-0 nova_compute[186249]: 2025-11-25 18:59:04.857 186253 INFO nova.virt.libvirt.driver [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 25 18:59:04 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 25 18:59:04 compute-0 python3.9[187103]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 18:59:05 compute-0 systemd[1]: Stopping nova_compute container...
Nov 25 18:59:05 compute-0 nova_compute[186249]: 2025-11-25 18:59:05.051 186253 DEBUG nova.virt.libvirt.driver [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 25 18:59:05 compute-0 nova_compute[186249]:   <model>Nehalem</model>
Nov 25 18:59:05 compute-0 nova_compute[186249]: </cpu>
Nov 25 18:59:05 compute-0 nova_compute[186249]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Nov 25 18:59:05 compute-0 nova_compute[186249]: 2025-11-25 18:59:05.054 186253 DEBUG nova.virt.libvirt.driver [None req-bd349716-3820-48f7-86bb-f78fac7ec890 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Nov 25 18:59:05 compute-0 nova_compute[186249]: 2025-11-25 18:59:05.098 186253 DEBUG oslo_concurrency.lockutils [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 18:59:05 compute-0 nova_compute[186249]: 2025-11-25 18:59:05.099 186253 DEBUG oslo_concurrency.lockutils [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 18:59:05 compute-0 nova_compute[186249]: 2025-11-25 18:59:05.099 186253 DEBUG oslo_concurrency.lockutils [None req-c8cfb7e9-cb10-4e4e-be8d-0b104e16964a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 18:59:05 compute-0 podman[187128]: 2025-11-25 18:59:05.145544244 +0000 UTC m=+0.117155519 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 18:59:05 compute-0 virtqemud[186888]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 25 18:59:05 compute-0 virtqemud[186888]: hostname: compute-0
Nov 25 18:59:05 compute-0 virtqemud[186888]: End of file while reading data: Input/output error
Nov 25 18:59:05 compute-0 systemd[1]: libpod-96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac.scope: Deactivated successfully.
Nov 25 18:59:05 compute-0 systemd[1]: libpod-96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac.scope: Consumed 3.281s CPU time.
Nov 25 18:59:05 compute-0 podman[187130]: 2025-11-25 18:59:05.645503502 +0000 UTC m=+0.602144404 container died 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 25 18:59:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac-userdata-shm.mount: Deactivated successfully.
Nov 25 18:59:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f-merged.mount: Deactivated successfully.
Nov 25 18:59:05 compute-0 podman[187130]: 2025-11-25 18:59:05.713188257 +0000 UTC m=+0.669829169 container cleanup 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.4)
Nov 25 18:59:05 compute-0 podman[187130]: nova_compute
Nov 25 18:59:05 compute-0 podman[187183]: nova_compute
Nov 25 18:59:05 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 25 18:59:05 compute-0 systemd[1]: Stopped nova_compute container.
Nov 25 18:59:05 compute-0 systemd[1]: Starting nova_compute container...
Nov 25 18:59:05 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:59:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5116fbbdf38c645236d21dab91b78e13c9ea08a67e7106d1c471a1e0a45c98f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:05 compute-0 podman[187197]: 2025-11-25 18:59:05.959094186 +0000 UTC m=+0.121617160 container init 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.4)
Nov 25 18:59:05 compute-0 podman[187197]: 2025-11-25 18:59:05.976032533 +0000 UTC m=+0.138555497 container start 96a5c6ce8bb1cc2d3dab1a133f7222c819f6d1fac875fb736c08bddb62a3e2ac (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm)
Nov 25 18:59:05 compute-0 podman[187197]: nova_compute
Nov 25 18:59:05 compute-0 nova_compute[187212]: + sudo -E kolla_set_configs
Nov 25 18:59:05 compute-0 systemd[1]: Started nova_compute container.
Nov 25 18:59:06 compute-0 sudo[187098]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Validating config file
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying service configuration files
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /etc/ceph
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Creating directory /etc/ceph
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /etc/ceph
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Writing out command to execute
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:59:06 compute-0 nova_compute[187212]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 25 18:59:06 compute-0 nova_compute[187212]: ++ cat /run_command
Nov 25 18:59:06 compute-0 nova_compute[187212]: + CMD=nova-compute
Nov 25 18:59:06 compute-0 nova_compute[187212]: + ARGS=
Nov 25 18:59:06 compute-0 nova_compute[187212]: + sudo kolla_copy_cacerts
Nov 25 18:59:06 compute-0 nova_compute[187212]: + [[ ! -n '' ]]
Nov 25 18:59:06 compute-0 nova_compute[187212]: + . kolla_extend_start
Nov 25 18:59:06 compute-0 nova_compute[187212]: + echo 'Running command: '\''nova-compute'\'''
Nov 25 18:59:06 compute-0 nova_compute[187212]: Running command: 'nova-compute'
Nov 25 18:59:06 compute-0 nova_compute[187212]: + umask 0022
Nov 25 18:59:06 compute-0 nova_compute[187212]: + exec nova-compute
Nov 25 18:59:06 compute-0 sudo[187373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhdazfndrxoehneompflizrkoofxwirh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097146.338764-2955-85847589759803/AnsiballZ_podman_container.py'
Nov 25 18:59:06 compute-0 sudo[187373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:06 compute-0 python3.9[187375]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 25 18:59:07 compute-0 systemd[1]: Started libpod-conmon-cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e.scope.
Nov 25 18:59:07 compute-0 systemd[1]: Started libcrun container.
Nov 25 18:59:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73ef5dd347cd09e8ead6013bc7e83dd8a344452a8a5974a4169db48fe461c0f3/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73ef5dd347cd09e8ead6013bc7e83dd8a344452a8a5974a4169db48fe461c0f3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73ef5dd347cd09e8ead6013bc7e83dd8a344452a8a5974a4169db48fe461c0f3/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 25 18:59:07 compute-0 podman[187411]: 2025-11-25 18:59:07.308953266 +0000 UTC m=+0.070906823 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 18:59:07 compute-0 podman[187399]: 2025-11-25 18:59:07.319950932 +0000 UTC m=+0.157650721 container init cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 18:59:07 compute-0 podman[187399]: 2025-11-25 18:59:07.328952305 +0000 UTC m=+0.166652064 container start cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 18:59:07 compute-0 python3.9[187375]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Applying nova statedir ownership
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 25 18:59:07 compute-0 nova_compute_init[187438]: INFO:nova_statedir:Nova statedir ownership complete
Nov 25 18:59:07 compute-0 systemd[1]: libpod-cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e.scope: Deactivated successfully.
Nov 25 18:59:07 compute-0 podman[187453]: 2025-11-25 18:59:07.441822648 +0000 UTC m=+0.028824028 container died cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Nov 25 18:59:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e-userdata-shm.mount: Deactivated successfully.
Nov 25 18:59:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-73ef5dd347cd09e8ead6013bc7e83dd8a344452a8a5974a4169db48fe461c0f3-merged.mount: Deactivated successfully.
Nov 25 18:59:07 compute-0 podman[187453]: 2025-11-25 18:59:07.486229715 +0000 UTC m=+0.073231075 container cleanup cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e (image=38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, io.buildah.version=1.41.4, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.27:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 25 18:59:07 compute-0 systemd[1]: libpod-conmon-cb89eaba6f8b2a3aa0a48b0b4e7d73c6e509043e60a971da655f41b930f69a8e.scope: Deactivated successfully.
Nov 25 18:59:07 compute-0 sudo[187373]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.000 187216 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.000 187216 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.000 187216 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.000 187216 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.114 187216 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 18:59:08 compute-0 sshd-session[159056]: Connection closed by 192.168.122.30 port 32820
Nov 25 18:59:08 compute-0 sshd-session[159053]: pam_unix(sshd:session): session closed for user zuul
Nov 25 18:59:08 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 25 18:59:08 compute-0 systemd[1]: session-24.scope: Consumed 2min 21.855s CPU time.
Nov 25 18:59:08 compute-0 systemd-logind[820]: Session 24 logged out. Waiting for processes to exit.
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.143 187216 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.143 187216 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Nov 25 18:59:08 compute-0 systemd-logind[820]: Removed session 24.
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.172 187216 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Nov 25 18:59:08 compute-0 nova_compute[187212]: 2025-11-25 18:59:08.173 187216 WARNING oslo_config.cfg [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.277 187216 INFO nova.virt.driver [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.402 187216 INFO nova.compute.provider_config [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.910 187216 DEBUG oslo_concurrency.lockutils [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.910 187216 DEBUG oslo_concurrency.lockutils [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.911 187216 DEBUG oslo_concurrency.lockutils [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.911 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.912 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.912 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.912 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.913 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.913 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.913 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.914 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.914 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.914 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.914 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.915 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.915 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.915 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.915 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.916 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.916 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.916 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.916 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.917 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.917 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.917 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.917 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.918 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.918 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.918 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.919 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.919 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.919 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.919 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.920 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.920 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.920 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.920 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.921 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.921 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.921 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.921 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.922 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.922 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.922 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.923 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.923 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.923 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.924 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.924 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.924 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.924 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.925 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.925 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.925 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.925 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.926 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.926 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.926 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.926 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.927 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.927 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.927 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.928 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.928 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.928 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.928 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.928 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.929 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.929 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.929 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.929 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.930 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.930 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.930 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.930 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.931 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.931 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.931 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.931 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.932 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.932 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.932 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.932 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.933 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.933 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.933 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.934 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.934 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.934 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.934 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.935 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.935 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.935 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.935 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.936 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.936 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.936 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.936 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.937 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.937 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.937 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.937 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.938 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.938 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.938 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.938 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.939 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.939 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.939 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.939 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.940 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.940 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.940 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.940 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.941 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.941 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.941 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.941 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.942 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.942 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.942 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.942 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.943 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.943 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.943 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.943 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.944 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.944 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.944 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.944 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.945 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.945 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.945 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.945 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.946 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.946 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.946 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.946 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.947 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.947 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.947 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.947 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.948 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.948 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.948 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.948 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.949 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.949 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.949 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.949 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.950 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.950 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.950 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.951 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.951 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.951 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.951 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.952 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.952 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.952 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.952 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.953 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.953 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.953 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.954 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.954 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.954 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.954 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.955 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.955 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.955 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.955 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.956 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.956 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.956 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.956 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.957 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.957 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.957 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.957 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.958 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.958 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.958 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.959 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.959 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.959 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.959 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.960 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.960 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.960 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.960 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.961 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.961 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.961 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.961 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.962 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.962 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.962 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.962 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.963 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.963 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.963 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.963 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.963 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.963 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.964 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.965 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.965 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.965 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.965 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.965 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.965 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.966 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.966 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.966 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.966 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.966 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.966 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.967 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.967 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.967 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.967 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.967 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.967 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.968 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.968 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.968 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.968 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.968 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.968 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.969 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.969 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.969 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.969 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.970 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.970 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.970 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.970 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.970 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.970 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.971 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.971 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.971 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.971 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.971 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.972 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.972 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.972 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.972 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.972 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.972 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.973 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.973 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.973 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.973 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.973 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.973 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.974 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.974 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.974 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.974 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.974 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.974 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.975 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.975 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.975 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.975 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.975 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.976 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.976 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.976 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.976 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.976 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.976 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.977 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.977 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.977 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.977 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.977 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.977 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.978 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.978 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.978 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.978 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.978 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.978 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.979 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.979 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.979 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.979 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.979 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.979 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.980 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.981 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.981 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.981 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.981 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.981 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.981 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.982 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.982 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.982 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.982 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.982 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.982 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.983 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.983 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.983 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.983 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.983 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.983 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.984 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.984 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.984 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.984 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.984 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.984 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.985 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.985 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.985 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.985 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.985 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.987 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.987 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.987 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.987 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.988 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.988 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.988 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.988 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.988 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.988 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.989 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.989 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.989 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.989 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.989 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.989 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.990 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.990 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.990 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.990 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.990 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.990 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.991 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.992 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.992 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.992 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.992 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.992 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.992 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.993 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.993 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.993 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.993 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.993 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.993 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.994 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.994 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.994 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.994 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.994 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.995 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.995 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.995 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.995 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.996 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.996 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.996 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.996 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.996 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.996 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.997 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.997 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.997 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.997 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.997 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.997 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.998 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.998 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.998 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.998 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.998 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:09 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:09.999 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.000 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.001 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.002 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.003 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.004 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.005 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.006 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.007 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.008 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.009 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.010 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.011 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.012 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.012 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.012 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.012 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.012 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.012 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.013 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 WARNING oslo_config.cfg [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 25 18:59:10 compute-0 nova_compute[187212]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 25 18:59:10 compute-0 nova_compute[187212]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 25 18:59:10 compute-0 nova_compute[187212]: and ``live_migration_inbound_addr`` respectively.
Nov 25 18:59:10 compute-0 nova_compute[187212]: ).  Its value may be silently ignored in the future.
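[Editor's note: the warning above indicates how the deprecated option maps onto its replacements. A minimal nova.conf sketch of that migration, assuming the `qemu+tls` scheme and the `192.168.122.100` inbound address that this deployment's own option dump reports; the exact values for another host would differ.]

```ini
[libvirt]
# Deprecated form (still honored, per the warning above):
#   live_migration_uri = qemu+tls://%s/system
# Replacement: split the URI into its scheme and target address.
live_migration_scheme = tls
live_migration_inbound_addr = 192.168.122.100
```

With these two options set, nova composes the migration URI itself, so `live_migration_uri` can be dropped before its removal.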
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.014 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.015 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.016 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.017 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.018 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.019 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.020 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.021 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.022 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.023 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.024 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.025 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.026 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.027 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.028 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.029 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.030 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.031 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.032 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.033 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.034 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.034 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.034 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.034 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.034 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.034 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.035 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.036 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.037 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.038 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.039 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.040 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.041 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.042 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.043 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.044 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.044 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.044 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.044 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.044 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.044 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.045 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.046 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.047 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.048 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.049 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.050 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.051 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.052 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.053 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.054 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.055 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.056 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.057 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.058 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.059 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.060 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.061 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.062 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.063 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.064 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.065 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.066 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.066 187216 DEBUG oslo_service.backend._eventlet.service [None req-0c68e3b5-cec5-4888-8af4-44fd21476b5c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.066 187216 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.572 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.591 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f87c4545430> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Nov 25 18:59:10 compute-0 nova_compute[187212]: libvirt:  error : internal error: could not initialize domain event timer
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.593 187216 WARNING nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.593 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f87c4545430> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.596 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.597 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.597 187216 INFO nova.utils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] The default thread pool MainProcess.default is initialized
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.598 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.599 187216 INFO nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Connection event '1' reason 'None'
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.613 187216 INFO nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Libvirt host capabilities <capabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]: 
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <host>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <uuid>67ab9541-c16e-406b-be77-292a72d03114</uuid>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <arch>x86_64</arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model>EPYC-Rome-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <vendor>AMD</vendor>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <microcode version='16777317'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <signature family='23' model='49' stepping='0'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='x2apic'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='tsc-deadline'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='osxsave'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='hypervisor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='tsc_adjust'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='spec-ctrl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='stibp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='arch-capabilities'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='cmp_legacy'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='topoext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='virt-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='lbrv'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='tsc-scale'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='vmcb-clean'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='pause-filter'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='pfthreshold'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='svme-addr-chk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='rdctl-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='skip-l1dfl-vmentry'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='mds-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature name='pschange-mc-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <pages unit='KiB' size='4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <pages unit='KiB' size='2048'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <pages unit='KiB' size='1048576'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <power_management>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <suspend_mem/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <suspend_disk/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <suspend_hybrid/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </power_management>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <iommu support='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <migration_features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <live/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <uri_transports>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <uri_transport>tcp</uri_transport>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <uri_transport>rdma</uri_transport>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </uri_transports>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </migration_features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <topology>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <cells num='1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <cell id='0'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           <memory unit='KiB'>7864316</memory>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           <pages unit='KiB' size='2048'>0</pages>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           <distances>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <sibling id='0' value='10'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           </distances>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           <cpus num='8'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:           </cpus>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         </cell>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </cells>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </topology>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <cache>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </cache>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <secmodel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model>selinux</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <doi>0</doi>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </secmodel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <secmodel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model>dac</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <doi>0</doi>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </secmodel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </host>
Nov 25 18:59:10 compute-0 nova_compute[187212]: 
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <guest>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <os_type>hvm</os_type>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <arch name='i686'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <wordsize>32</wordsize>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <domain type='qemu'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <domain type='kvm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <pae/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <nonpae/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <acpi default='on' toggle='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <apic default='on' toggle='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <cpuselection/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <deviceboot/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <disksnapshot default='on' toggle='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <externalSnapshot/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </guest>
Nov 25 18:59:10 compute-0 nova_compute[187212]: 
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <guest>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <os_type>hvm</os_type>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <arch name='x86_64'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <wordsize>64</wordsize>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <domain type='qemu'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <domain type='kvm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <acpi default='on' toggle='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <apic default='on' toggle='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <cpuselection/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <deviceboot/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <disksnapshot default='on' toggle='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <externalSnapshot/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </guest>
Nov 25 18:59:10 compute-0 nova_compute[187212]: 
Nov 25 18:59:10 compute-0 nova_compute[187212]: </capabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]: 
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.623 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.628 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 25 18:59:10 compute-0 nova_compute[187212]: <domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <domain>kvm</domain>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <arch>i686</arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <vcpu max='240'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <iothreads supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <os supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='firmware'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <loader supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>rom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pflash</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='readonly'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>yes</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='secure'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </loader>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </os>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='maximumMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <vendor>AMD</vendor>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='succor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='custom' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-128'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-256'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-512'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <memoryBacking supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='sourceType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>anonymous</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>memfd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </memoryBacking>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <disk supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='diskDevice'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>disk</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cdrom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>floppy</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>lun</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ide</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>fdc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>sata</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </disk>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <graphics supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vnc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egl-headless</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </graphics>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <video supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='modelType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vga</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cirrus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>none</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>bochs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ramfb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </video>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hostdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='mode'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>subsystem</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='startupPolicy'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>mandatory</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>requisite</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>optional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='subsysType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pci</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='capsType'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='pciBackend'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hostdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <rng supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>random</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </rng>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <filesystem supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='driverType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>path</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>handle</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtiofs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </filesystem>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <tpm supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-tis</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-crb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emulator</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>external</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendVersion'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>2.0</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </tpm>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <redirdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </redirdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <channel supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </channel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <crypto supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </crypto>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <interface supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>passt</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </interface>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <panic supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>isa</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>hyperv</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </panic>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <console supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>null</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dev</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pipe</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stdio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>udp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tcp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu-vdagent</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </console>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <gic supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <genid supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backup supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <async-teardown supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <ps2 supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sev supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sgx supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hyperv supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='features'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>relaxed</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vapic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>spinlocks</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vpindex</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>runtime</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>synic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stimer</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reset</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vendor_id</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>frequencies</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reenlightenment</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tlbflush</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ipi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>avic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emsr_bitmap</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>xmm_input</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hyperv>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <launchSecurity supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='sectype'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tdx</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </launchSecurity>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </features>
Nov 25 18:59:10 compute-0 nova_compute[187212]: </domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.638 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 25 18:59:10 compute-0 nova_compute[187212]: <domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <domain>kvm</domain>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <arch>i686</arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <vcpu max='4096'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <iothreads supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <os supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='firmware'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <loader supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>rom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pflash</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='readonly'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>yes</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='secure'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </loader>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </os>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='maximumMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <vendor>AMD</vendor>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='succor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='custom' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-128'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-256'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-512'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <memoryBacking supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='sourceType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>anonymous</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>memfd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </memoryBacking>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <disk supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='diskDevice'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>disk</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cdrom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>floppy</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>lun</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>fdc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>sata</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </disk>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <graphics supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vnc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egl-headless</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </graphics>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <video supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='modelType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vga</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cirrus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>none</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>bochs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ramfb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </video>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hostdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='mode'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>subsystem</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='startupPolicy'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>mandatory</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>requisite</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>optional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='subsysType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pci</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='capsType'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='pciBackend'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hostdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <rng supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>random</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </rng>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <filesystem supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='driverType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>path</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>handle</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtiofs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </filesystem>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <tpm supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-tis</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-crb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emulator</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>external</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendVersion'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>2.0</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </tpm>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <redirdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </redirdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <channel supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </channel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <crypto supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </crypto>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <interface supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>passt</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </interface>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <panic supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>isa</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>hyperv</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </panic>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <console supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>null</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dev</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pipe</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stdio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>udp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tcp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu-vdagent</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </console>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <gic supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <genid supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backup supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <async-teardown supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <ps2 supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sev supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sgx supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hyperv supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='features'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>relaxed</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vapic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>spinlocks</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vpindex</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>runtime</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>synic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stimer</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reset</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vendor_id</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>frequencies</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reenlightenment</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tlbflush</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ipi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>avic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emsr_bitmap</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>xmm_input</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hyperv>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <launchSecurity supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='sectype'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tdx</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </launchSecurity>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </features>
Nov 25 18:59:10 compute-0 nova_compute[187212]: </domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.688 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.695 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 25 18:59:10 compute-0 nova_compute[187212]: <domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <domain>kvm</domain>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <arch>x86_64</arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <vcpu max='240'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <iothreads supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <os supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='firmware'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <loader supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>rom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pflash</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='readonly'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>yes</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='secure'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </loader>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </os>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='maximumMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <vendor>AMD</vendor>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='succor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='custom' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-128'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-256'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-512'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <memoryBacking supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='sourceType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>anonymous</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>memfd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </memoryBacking>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <disk supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='diskDevice'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>disk</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cdrom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>floppy</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>lun</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ide</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>fdc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>sata</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </disk>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <graphics supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vnc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egl-headless</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </graphics>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <video supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='modelType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vga</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cirrus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>none</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>bochs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ramfb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </video>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hostdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='mode'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>subsystem</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='startupPolicy'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>mandatory</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>requisite</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>optional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='subsysType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pci</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='capsType'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='pciBackend'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hostdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <rng supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>random</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </rng>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <filesystem supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='driverType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>path</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>handle</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtiofs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </filesystem>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <tpm supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-tis</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-crb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emulator</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>external</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendVersion'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>2.0</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </tpm>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <redirdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </redirdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <channel supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </channel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <crypto supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </crypto>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <interface supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>passt</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </interface>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <panic supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>isa</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>hyperv</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </panic>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <console supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>null</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dev</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pipe</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stdio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>udp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tcp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu-vdagent</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </console>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <gic supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <genid supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backup supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <async-teardown supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <ps2 supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sev supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sgx supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hyperv supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='features'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>relaxed</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vapic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>spinlocks</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vpindex</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>runtime</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>synic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stimer</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reset</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vendor_id</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>frequencies</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reenlightenment</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tlbflush</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ipi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>avic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emsr_bitmap</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>xmm_input</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hyperv>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <launchSecurity supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='sectype'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tdx</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </launchSecurity>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </features>
Nov 25 18:59:10 compute-0 nova_compute[187212]: </domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.756 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 25 18:59:10 compute-0 nova_compute[187212]: <domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <path>/usr/libexec/qemu-kvm</path>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <domain>kvm</domain>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <arch>x86_64</arch>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <vcpu max='4096'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <iothreads supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <os supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='firmware'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>efi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <loader supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>rom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pflash</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='readonly'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>yes</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='secure'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>yes</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>no</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </loader>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </os>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-passthrough' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='hostPassthroughMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='maximum' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='maximumMigratable'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>on</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>off</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='host-model' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <vendor>AMD</vendor>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='x2apic'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-deadline'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='hypervisor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc_adjust'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='spec-ctrl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='stibp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='cmp_legacy'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='overflow-recov'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='succor'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='amd-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='virt-ssbd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lbrv'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='tsc-scale'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='vmcb-clean'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='flushbyasid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pause-filter'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='pfthreshold'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='svme-addr-chk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <feature policy='disable' name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <mode name='custom' supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Broadwell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cascadelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Cooperlake-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Denverton-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Dhyana-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Genoa-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='auto-ibrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Milan-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amd-psfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='no-nested-data-bp'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='null-sel-clr-base'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='stibp-always-on'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-Rome-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='EPYC-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='GraniteRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-128'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-256'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx10-512'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='prefetchiti'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Haswell-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-noTSX'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v6'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Icelake-Server-v7'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='IvyBridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='KnightsMill-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4fmaps'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-4vnniw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512er'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512pf'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G4-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Opteron_G5-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fma4'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tbm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xop'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SapphireRapids-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='amx-tile'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-bf16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-fp16'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512-vpopcntdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bitalg'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vbmi2'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrc'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fzrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='la57'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='taa-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='tsx-ldtrk'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xfd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='SierraForest-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ifma'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-ne-convert'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx-vnni-int8'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='bus-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cmpccxadd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fbsdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='fsrs'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ibrs-all'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mcdt-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pbrsb-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='psdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='sbdr-ssdp-no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='serialize'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vaes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='vpclmulqdq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Client-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='hle'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='rtm'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Skylake-Server-v5'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512bw'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512cd'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512dq'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512f'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='avx512vl'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='invpcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pcid'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='pku'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='mpx'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v2'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v3'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='core-capability'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='split-lock-detect'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='Snowridge-v4'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='cldemote'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='erms'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='gfni'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdir64b'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='movdiri'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='xsaves'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='athlon-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='core2duo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='coreduo-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='n270-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='ss'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <blockers model='phenom-v1'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnow'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <feature name='3dnowext'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </blockers>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </mode>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <memoryBacking supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <enum name='sourceType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>anonymous</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <value>memfd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </memoryBacking>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <disk supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='diskDevice'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>disk</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cdrom</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>floppy</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>lun</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>fdc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>sata</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </disk>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <graphics supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vnc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egl-headless</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </graphics>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <video supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='modelType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vga</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>cirrus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>none</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>bochs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ramfb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </video>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hostdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='mode'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>subsystem</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='startupPolicy'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>mandatory</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>requisite</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>optional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='subsysType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pci</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>scsi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='capsType'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='pciBackend'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hostdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <rng supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtio-non-transitional</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>random</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>egd</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </rng>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <filesystem supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='driverType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>path</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>handle</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>virtiofs</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </filesystem>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <tpm supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-tis</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tpm-crb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emulator</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>external</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendVersion'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>2.0</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </tpm>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <redirdev supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='bus'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>usb</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </redirdev>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <channel supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </channel>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <crypto supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendModel'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>builtin</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </crypto>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <interface supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='backendType'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>default</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>passt</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </interface>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <panic supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='model'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>isa</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>hyperv</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </panic>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <console supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='type'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>null</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vc</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pty</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dev</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>file</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>pipe</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stdio</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>udp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tcp</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>unix</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>qemu-vdagent</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>dbus</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </console>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </devices>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <features>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <gic supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <vmcoreinfo supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <genid supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backingStoreInput supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <backup supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <async-teardown supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <ps2 supported='yes'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sev supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <sgx supported='no'/>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <hyperv supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='features'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>relaxed</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vapic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>spinlocks</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vpindex</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>runtime</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>synic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>stimer</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reset</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>vendor_id</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>frequencies</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>reenlightenment</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tlbflush</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>ipi</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>avic</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>emsr_bitmap</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>xmm_input</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <spinlocks>4095</spinlocks>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <stimer_direct>on</stimer_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_direct>on</tlbflush_direct>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <tlbflush_extended>on</tlbflush_extended>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </defaults>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </hyperv>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     <launchSecurity supported='yes'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       <enum name='sectype'>
Nov 25 18:59:10 compute-0 nova_compute[187212]:         <value>tdx</value>
Nov 25 18:59:10 compute-0 nova_compute[187212]:       </enum>
Nov 25 18:59:10 compute-0 nova_compute[187212]:     </launchSecurity>
Nov 25 18:59:10 compute-0 nova_compute[187212]:   </features>
Nov 25 18:59:10 compute-0 nova_compute[187212]: </domainCapabilities>
Nov 25 18:59:10 compute-0 nova_compute[187212]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.818 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.819 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.819 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.819 187216 INFO nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Secure Boot support detected
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.827 187216 INFO nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.827 187216 INFO nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.847 187216 DEBUG nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 25 18:59:10 compute-0 nova_compute[187212]:   <model>Nehalem</model>
Nov 25 18:59:10 compute-0 nova_compute[187212]: </cpu>
Nov 25 18:59:10 compute-0 nova_compute[187212]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Nov 25 18:59:10 compute-0 nova_compute[187212]: 2025-11-25 18:59:10.850 187216 DEBUG nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Nov 25 18:59:11 compute-0 nova_compute[187212]: 2025-11-25 18:59:11.106 187216 WARNING nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 25 18:59:11 compute-0 nova_compute[187212]: 2025-11-25 18:59:11.106 187216 DEBUG nova.virt.libvirt.volume.mount [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 25 18:59:11 compute-0 nova_compute[187212]: 2025-11-25 18:59:11.361 187216 INFO nova.virt.node [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Determined node identity bd855788-e41f-445a-8ef6-eb363fed2f12 from /var/lib/nova/compute_id
Nov 25 18:59:11 compute-0 nova_compute[187212]: 2025-11-25 18:59:11.870 187216 WARNING nova.compute.manager [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Compute nodes ['bd855788-e41f-445a-8ef6-eb363fed2f12'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 25 18:59:12 compute-0 nova_compute[187212]: 2025-11-25 18:59:12.883 187216 INFO nova.compute.manager [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 25 18:59:13 compute-0 sshd-session[187523]: Accepted publickey for zuul from 192.168.122.30 port 41762 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 18:59:13 compute-0 systemd-logind[820]: New session 26 of user zuul.
Nov 25 18:59:13 compute-0 systemd[1]: Started Session 26 of User zuul.
Nov 25 18:59:13 compute-0 sshd-session[187523]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 18:59:13 compute-0 nova_compute[187212]: 2025-11-25 18:59:13.903 187216 WARNING nova.compute.manager [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 25 18:59:13 compute-0 nova_compute[187212]: 2025-11-25 18:59:13.904 187216 DEBUG oslo_concurrency.lockutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:59:13 compute-0 nova_compute[187212]: 2025-11-25 18:59:13.904 187216 DEBUG oslo_concurrency.lockutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:59:13 compute-0 nova_compute[187212]: 2025-11-25 18:59:13.904 187216 DEBUG oslo_concurrency.lockutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:59:13 compute-0 nova_compute[187212]: 2025-11-25 18:59:13.905 187216 DEBUG nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.130 187216 WARNING nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.132 187216 DEBUG oslo_concurrency.processutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.160 187216 DEBUG oslo_concurrency.processutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.162 187216 DEBUG nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6218MB free_disk=73.19866561889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.162 187216 DEBUG oslo_concurrency.lockutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.163 187216 DEBUG oslo_concurrency.lockutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:59:14 compute-0 nova_compute[187212]: 2025-11-25 18:59:14.672 187216 WARNING nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] No compute node record for compute-0.ctlplane.example.com:bd855788-e41f-445a-8ef6-eb363fed2f12: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host bd855788-e41f-445a-8ef6-eb363fed2f12 could not be found.
Nov 25 18:59:14 compute-0 python3.9[187677]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 25 18:59:15 compute-0 nova_compute[187212]: 2025-11-25 18:59:15.183 187216 INFO nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: bd855788-e41f-445a-8ef6-eb363fed2f12
Nov 25 18:59:16 compute-0 sudo[187848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-misjxoxaimahhdnhysahndwvbsivuvxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097155.488593-52-260195983535431/AnsiballZ_systemd_service.py'
Nov 25 18:59:16 compute-0 podman[187801]: 2025-11-25 18:59:16.202223781 +0000 UTC m=+0.113444930 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 18:59:16 compute-0 sudo[187848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:16 compute-0 python3.9[187853]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:59:16 compute-0 systemd[1]: Reloading.
Nov 25 18:59:16 compute-0 systemd-rc-local-generator[187879]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:59:16 compute-0 systemd-sysv-generator[187884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:59:16 compute-0 nova_compute[187212]: 2025-11-25 18:59:16.711 187216 DEBUG nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 18:59:16 compute-0 nova_compute[187212]: 2025-11-25 18:59:16.712 187216 DEBUG nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 18:59:14 up 51 min,  0 user,  load average: 0.76, 0.84, 0.67\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 18:59:16 compute-0 sudo[187848]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:17 compute-0 nova_compute[187212]: 2025-11-25 18:59:17.558 187216 INFO nova.scheduler.client.report [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] [req-48b2533c-3cc0-442f-b491-709650e5a19a] Created resource provider record via placement API for resource provider with UUID bd855788-e41f-445a-8ef6-eb363fed2f12 and name compute-0.ctlplane.example.com.
Nov 25 18:59:17 compute-0 nova_compute[187212]: 2025-11-25 18:59:17.585 187216 DEBUG nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 25 18:59:17 compute-0 nova_compute[187212]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Nov 25 18:59:17 compute-0 nova_compute[187212]: 2025-11-25 18:59:17.585 187216 INFO nova.virt.libvirt.host [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] kernel doesn't support AMD SEV
Nov 25 18:59:17 compute-0 nova_compute[187212]: 2025-11-25 18:59:17.586 187216 DEBUG nova.compute.provider_tree [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 18:59:17 compute-0 nova_compute[187212]: 2025-11-25 18:59:17.587 187216 DEBUG nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 18:59:17 compute-0 nova_compute[187212]: 2025-11-25 18:59:17.591 187216 DEBUG nova.virt.libvirt.driver [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Libvirt baseline CPU <cpu>
Nov 25 18:59:17 compute-0 nova_compute[187212]:   <arch>x86_64</arch>
Nov 25 18:59:17 compute-0 nova_compute[187212]:   <model>Nehalem</model>
Nov 25 18:59:17 compute-0 nova_compute[187212]:   <vendor>AMD</vendor>
Nov 25 18:59:17 compute-0 nova_compute[187212]:   <topology sockets="8" cores="1" threads="1"/>
Nov 25 18:59:17 compute-0 nova_compute[187212]:   <maxphysaddr mode="emulate" bits="40"/>
Nov 25 18:59:17 compute-0 nova_compute[187212]: </cpu>
Nov 25 18:59:17 compute-0 nova_compute[187212]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Nov 25 18:59:17 compute-0 python3.9[188038]: ansible-ansible.builtin.service_facts Invoked
Nov 25 18:59:17 compute-0 network[188055]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 25 18:59:17 compute-0 network[188056]: 'network-scripts' will be removed from distribution in near future.
Nov 25 18:59:17 compute-0 network[188057]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.194 187216 DEBUG nova.scheduler.client.report [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Updated inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.195 187216 DEBUG nova.compute.provider_tree [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Updating resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.195 187216 DEBUG nova.compute.provider_tree [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.293 187216 DEBUG nova.compute.provider_tree [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Updating resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.803 187216 DEBUG nova.compute.resource_tracker [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.804 187216 DEBUG oslo_concurrency.lockutils [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.641s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.805 187216 DEBUG nova.service [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.898 187216 DEBUG nova.service [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Nov 25 18:59:18 compute-0 nova_compute[187212]: 2025-11-25 18:59:18.899 187216 DEBUG nova.servicegroup.drivers.db [None req-b1f41a60-da6d-45d1-832c-0b489e4a60a3 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Nov 25 18:59:23 compute-0 sudo[188329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvxgfvukukmmbeglmatwokfcvaoeogby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097162.8832731-90-219704285733288/AnsiballZ_systemd_service.py'
Nov 25 18:59:23 compute-0 sudo[188329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:23 compute-0 python3.9[188331]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 18:59:23 compute-0 sudo[188329]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:24 compute-0 sudo[188482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skqmqihdorlksvwilgvgxbxepgplynrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097164.0709145-110-114265136141478/AnsiballZ_file.py'
Nov 25 18:59:24 compute-0 sudo[188482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:24 compute-0 python3.9[188484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:24 compute-0 sudo[188482]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:24 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 18:59:25 compute-0 sudo[188635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpqxbshsfzwtqhpzoetmcukpvbhjbjlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097165.0847762-126-7250480529528/AnsiballZ_file.py'
Nov 25 18:59:25 compute-0 sudo[188635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:25 compute-0 python3.9[188637]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:25 compute-0 sudo[188635]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:26 compute-0 sudo[188787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvzzpwmaypapicmcmovnlugjdscbmnvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097166.0829153-144-58370133768536/AnsiballZ_command.py'
Nov 25 18:59:26 compute-0 sudo[188787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:26 compute-0 python3.9[188789]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:59:26 compute-0 sudo[188787]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:27 compute-0 python3.9[188941]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 18:59:27 compute-0 nova_compute[187212]: 2025-11-25 18:59:27.902 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 18:59:28 compute-0 nova_compute[187212]: 2025-11-25 18:59:28.417 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 18:59:28 compute-0 sudo[189091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcugkidqyriduhmtjgjjepttpqkeweox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097168.248011-180-30441999492093/AnsiballZ_systemd_service.py'
Nov 25 18:59:28 compute-0 sudo[189091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:29 compute-0 python3.9[189093]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 18:59:29 compute-0 systemd[1]: Reloading.
Nov 25 18:59:29 compute-0 systemd-rc-local-generator[189119]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 18:59:29 compute-0 systemd-sysv-generator[189125]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 18:59:29 compute-0 sudo[189091]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:29 compute-0 sudo[189278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcljgqqjrylcfswbrdmfbdczjggfwinc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097169.5958743-196-143223777408336/AnsiballZ_command.py'
Nov 25 18:59:29 compute-0 sudo[189278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:30 compute-0 python3.9[189280]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 18:59:30 compute-0 sudo[189278]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:30 compute-0 sudo[189431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvktliewaywnpgfwpbaftnnvsvtervxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097170.6017225-214-32054202115400/AnsiballZ_file.py'
Nov 25 18:59:31 compute-0 sudo[189431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:59:31.050 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 18:59:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:59:31.052 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 18:59:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 18:59:31.052 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 18:59:31 compute-0 python3.9[189433]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:59:31 compute-0 sudo[189431]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:33 compute-0 python3.9[189584]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:59:34 compute-0 python3.9[189736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:35 compute-0 podman[189831]: 2025-11-25 18:59:35.632544684 +0000 UTC m=+0.140369577 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 25 18:59:35 compute-0 python3.9[189868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097174.3055866-246-27427920998622/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 25 18:59:36 compute-0 sudo[190034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tthuqhdcyytnmgxlazowpkfckqsmghxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097175.9887245-276-106963625831833/AnsiballZ_group.py'
Nov 25 18:59:36 compute-0 sudo[190034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:36 compute-0 python3.9[190036]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 25 18:59:36 compute-0 sudo[190034]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:37 compute-0 sudo[190199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpjahvmmugkyiqobcxzislsjeygiatwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097177.2387402-298-96499628655490/AnsiballZ_getent.py'
Nov 25 18:59:37 compute-0 sudo[190199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:37 compute-0 podman[190160]: 2025-11-25 18:59:37.901931067 +0000 UTC m=+0.087035416 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Nov 25 18:59:38 compute-0 python3.9[190205]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 25 18:59:38 compute-0 sudo[190199]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:38 compute-0 sudo[190356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxodhpvwijkqyzpvslawntzpoohwqmef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097178.369985-314-219259639565626/AnsiballZ_group.py'
Nov 25 18:59:38 compute-0 sudo[190356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:38 compute-0 python3.9[190358]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 25 18:59:38 compute-0 groupadd[190359]: group added to /etc/group: name=ceilometer, GID=42405
Nov 25 18:59:39 compute-0 groupadd[190359]: group added to /etc/gshadow: name=ceilometer
Nov 25 18:59:39 compute-0 groupadd[190359]: new group: name=ceilometer, GID=42405
Nov 25 18:59:39 compute-0 sudo[190356]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:39 compute-0 sudo[190514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czcvffetqxpbcjkvbopbvbwattxvfrsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097179.3295362-330-52828539424394/AnsiballZ_user.py'
Nov 25 18:59:39 compute-0 sudo[190514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 18:59:40 compute-0 python3.9[190516]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 25 18:59:40 compute-0 useradd[190518]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 25 18:59:40 compute-0 useradd[190518]: add 'ceilometer' to group 'libvirt'
Nov 25 18:59:40 compute-0 useradd[190518]: add 'ceilometer' to shadow group 'libvirt'
Nov 25 18:59:40 compute-0 sudo[190514]: pam_unix(sudo:session): session closed for user root
Nov 25 18:59:41 compute-0 python3.9[190674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:42 compute-0 python3.9[190795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764097181.1258821-382-223728973765082/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:43 compute-0 python3.9[190945]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:43 compute-0 python3.9[191066]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764097182.5608435-382-86007074149925/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:44 compute-0 python3.9[191216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:45 compute-0 python3.9[191337]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764097183.9437072-382-266982600353416/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:46 compute-0 python3.9[191487]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:59:46 compute-0 podman[191613]: 2025-11-25 18:59:46.636223776 +0000 UTC m=+0.089162691 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 18:59:46 compute-0 python3.9[191652]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 18:59:47 compute-0 python3.9[191811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:48 compute-0 python3.9[191932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097187.0582-500-37732657399544/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:49 compute-0 python3.9[192082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:49 compute-0 python3.9[192158]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:50 compute-0 python3.9[192308]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:51 compute-0 python3.9[192429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097190.1348646-500-53097255239521/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=1e04341d6be063104ba1c6c4b6bc412561525ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:52 compute-0 python3.9[192579]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:52 compute-0 python3.9[192700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097191.6203606-500-99324800662465/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:53 compute-0 python3.9[192850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:54 compute-0 python3.9[192971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097193.0748737-500-58804900870153/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:55 compute-0 python3.9[193121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:55 compute-0 python3.9[193242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097194.545272-500-142419171124595/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:56 compute-0 python3.9[193392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:57 compute-0 python3.9[193513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097195.9712753-500-161973663861802/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:57 compute-0 python3.9[193663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 18:59:58 compute-0 python3.9[193784]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097197.4026268-500-149799853120373/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 18:59:59 compute-0 python3.9[193934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:00 compute-0 python3.9[194055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097198.9317775-500-125449293372226/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:00 compute-0 python3.9[194205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:01 compute-0 python3.9[194326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097200.3539464-500-219129059622285/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:02 compute-0 python3.9[194476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:03 compute-0 python3.9[194597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097201.8749754-500-110677344460793/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:04 compute-0 python3.9[194747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:04 compute-0 python3.9[194823]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:05 compute-0 python3.9[194973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:05 compute-0 podman[195023]: 2025-11-25 19:00:05.860673105 +0000 UTC m=+0.174740527 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 25 19:00:05 compute-0 python3.9[195060]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:06 compute-0 python3.9[195228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:07 compute-0 python3.9[195304]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:08 compute-0 sudo[195467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bptfhxjgldfqxlgwlswhjvxvihorwfnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097207.736158-878-210578027228809/AnsiballZ_file.py'
Nov 25 19:00:08 compute-0 sudo[195467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:08 compute-0 podman[195428]: 2025-11-25 19:00:08.165795984 +0000 UTC m=+0.085434744 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.177 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.177 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.177 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.178 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.178 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.178 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.178 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:00:08 compute-0 python3.9[195475]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:08 compute-0 sudo[195467]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.698 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.699 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.700 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.700 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.894 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.895 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.913 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.914 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6164MB free_disk=73.19861602783203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.914 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:00:08 compute-0 nova_compute[187212]: 2025-11-25 19:00:08.914 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:00:09 compute-0 sudo[195626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abhqrwxfipwydncscgxiptlptbqkihwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097208.684845-894-188811769515223/AnsiballZ_file.py'
Nov 25 19:00:09 compute-0 sudo[195626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:09 compute-0 python3.9[195628]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:09 compute-0 sudo[195626]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:09 compute-0 sudo[195778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfgfbhvenxwmisrvbxkitfwcwfzmqdwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097209.5862098-910-133046840300160/AnsiballZ_file.py'
Nov 25 19:00:09 compute-0 sudo[195778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:09 compute-0 nova_compute[187212]: 2025-11-25 19:00:09.981 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:00:09 compute-0 nova_compute[187212]: 2025-11-25 19:00:09.981 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:00:08 up 52 min,  0 user,  load average: 0.94, 0.89, 0.70\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:00:10 compute-0 nova_compute[187212]: 2025-11-25 19:00:10.006 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:00:10 compute-0 python3.9[195780]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 25 19:00:10 compute-0 sudo[195778]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:10 compute-0 nova_compute[187212]: 2025-11-25 19:00:10.515 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:00:11 compute-0 nova_compute[187212]: 2025-11-25 19:00:11.026 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:00:11 compute-0 nova_compute[187212]: 2025-11-25 19:00:11.026 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:00:11 compute-0 sudo[195930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnaxwfuzozkacpkbiyxeuzjcbpahgemi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097210.4866233-926-53418762166560/AnsiballZ_systemd_service.py'
Nov 25 19:00:11 compute-0 sudo[195930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:11 compute-0 python3.9[195932]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 19:00:12 compute-0 systemd[1]: Reloading.
Nov 25 19:00:12 compute-0 systemd-rc-local-generator[195966]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:00:12 compute-0 systemd-sysv-generator[195969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:00:12 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 25 19:00:12 compute-0 sudo[195930]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:13 compute-0 sudo[196122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiptmnjvzfjntozmiwpkiipjjmkeisrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097213.1929653-944-145860304333255/AnsiballZ_stat.py'
Nov 25 19:00:13 compute-0 sudo[196122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:13 compute-0 python3.9[196124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:13 compute-0 sudo[196122]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:14 compute-0 sudo[196245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlfszhpdacplhprtxnijagywvaagjmqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097213.1929653-944-145860304333255/AnsiballZ_copy.py'
Nov 25 19:00:14 compute-0 sudo[196245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:14 compute-0 python3.9[196247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097213.1929653-944-145860304333255/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 19:00:14 compute-0 sudo[196245]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:15 compute-0 sudo[196397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeajllexdylbbrvtdfkmhchuktcdtqij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097214.9885035-978-259273730575235/AnsiballZ_container_config_data.py'
Nov 25 19:00:15 compute-0 sudo[196397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:15 compute-0 python3.9[196399]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 25 19:00:15 compute-0 sudo[196397]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:16 compute-0 sudo[196562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeubkammuuisrzabzfqeucoptzbowxcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097216.23122-996-52810649650507/AnsiballZ_container_config_hash.py'
Nov 25 19:00:16 compute-0 sudo[196562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:16 compute-0 podman[196523]: 2025-11-25 19:00:16.822035321 +0000 UTC m=+0.103898253 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 19:00:17 compute-0 python3.9[196570]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 19:00:17 compute-0 sudo[196562]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:18 compute-0 sudo[196721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbbihntfoegncbxtpmtgllmzzmahkslq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764097217.8020847-1016-143591846588894/AnsiballZ_edpm_container_manage.py'
Nov 25 19:00:18 compute-0 sudo[196721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:18 compute-0 python3[196723]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 19:00:20 compute-0 podman[196736]: 2025-11-25 19:00:20.125780855 +0000 UTC m=+1.272950543 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 25 19:00:20 compute-0 podman[196835]: 2025-11-25 19:00:20.33367409 +0000 UTC m=+0.083715489 container create e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Nov 25 19:00:20 compute-0 podman[196835]: 2025-11-25 19:00:20.289011883 +0000 UTC m=+0.039053332 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 25 19:00:20 compute-0 python3[196723]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 25 19:00:20 compute-0 sudo[196721]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:21 compute-0 sudo[197023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogepbjtehibqrizmellrvnaofjkhwfjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097221.0715518-1032-199572039771764/AnsiballZ_stat.py'
Nov 25 19:00:21 compute-0 sudo[197023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:21 compute-0 python3.9[197025]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 19:00:21 compute-0 sudo[197023]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:22 compute-0 sudo[197177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaoglbwmbzkzivcgrmvwxaptcgrvvbxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097222.2530746-1050-280942647077286/AnsiballZ_file.py'
Nov 25 19:00:22 compute-0 sudo[197177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:22 compute-0 python3.9[197179]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:22 compute-0 sudo[197177]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:23 compute-0 sudo[197328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgtxbricpqstjwgyzvdezdmlnhihnks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097222.9448912-1050-129863747396953/AnsiballZ_copy.py'
Nov 25 19:00:23 compute-0 sudo[197328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:23 compute-0 python3.9[197330]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764097222.9448912-1050-129863747396953/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:23 compute-0 sudo[197328]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:24 compute-0 sudo[197404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixlunvbnshrfskpyvjszisaodxtkclbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097222.9448912-1050-129863747396953/AnsiballZ_systemd.py'
Nov 25 19:00:24 compute-0 sudo[197404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:24 compute-0 python3.9[197406]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 19:00:24 compute-0 systemd[1]: Reloading.
Nov 25 19:00:24 compute-0 systemd-rc-local-generator[197436]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:00:24 compute-0 systemd-sysv-generator[197440]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:00:25 compute-0 sudo[197404]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:25 compute-0 sudo[197516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rujvfpslsvsqepxkqbtbmudcprnubild ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097222.9448912-1050-129863747396953/AnsiballZ_systemd.py'
Nov 25 19:00:25 compute-0 sudo[197516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:25 compute-0 python3.9[197518]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 19:00:25 compute-0 systemd[1]: Reloading.
Nov 25 19:00:25 compute-0 systemd-rc-local-generator[197548]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:00:25 compute-0 systemd-sysv-generator[197552]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:00:26 compute-0 systemd[1]: Starting podman_exporter container...
Nov 25 19:00:26 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:00:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c833ae51ba237ae50ea3d35037b5146dac66be50f95eae115a60443e88605401/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c833ae51ba237ae50ea3d35037b5146dac66be50f95eae115a60443e88605401/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:26 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.
Nov 25 19:00:26 compute-0 podman[197558]: 2025-11-25 19:00:26.347325348 +0000 UTC m=+0.181688587 container init e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.372Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.372Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.372Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.372Z caller=handler.go:105 level=info collector=container
Nov 25 19:00:26 compute-0 podman[197558]: 2025-11-25 19:00:26.38794035 +0000 UTC m=+0.222303579 container start e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:00:26 compute-0 podman[197558]: podman_exporter
Nov 25 19:00:26 compute-0 systemd[1]: Starting Podman API Service...
Nov 25 19:00:26 compute-0 systemd[1]: Started Podman API Service.
Nov 25 19:00:26 compute-0 systemd[1]: Started podman_exporter container.
Nov 25 19:00:26 compute-0 sudo[197516]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="Setting parallel job count to 25"
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="Using sqlite as database backend"
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 25 19:00:26 compute-0 podman[197585]: @ - - [25/Nov/2025:19:00:26 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 25 19:00:26 compute-0 podman[197585]: time="2025-11-25T19:00:26Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:00:26 compute-0 podman[197585]: @ - - [25/Nov/2025:19:00:26 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14050 "" "Go-http-client/1.1"
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.493Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.495Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 25 19:00:26 compute-0 podman_exporter[197573]: ts=2025-11-25T19:00:26.495Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 25 19:00:26 compute-0 podman[197584]: 2025-11-25 19:00:26.519260302 +0000 UTC m=+0.113880701 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:00:26 compute-0 systemd[1]: e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f-5b476f25fe3c117c.service: Main process exited, code=exited, status=1/FAILURE
Nov 25 19:00:26 compute-0 systemd[1]: e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f-5b476f25fe3c117c.service: Failed with result 'exit-code'.
Nov 25 19:00:28 compute-0 sudo[197767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxuijgtqbxjmymusalyzkwcmxiqbmltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097227.7415748-1098-119088338597785/AnsiballZ_systemd.py'
Nov 25 19:00:28 compute-0 sudo[197767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:28 compute-0 python3.9[197769]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 19:00:28 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 25 19:00:28 compute-0 podman[197585]: @ - - [25/Nov/2025:19:00:26 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 25 19:00:28 compute-0 systemd[1]: libpod-e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.scope: Deactivated successfully.
Nov 25 19:00:28 compute-0 podman[197773]: 2025-11-25 19:00:28.630552259 +0000 UTC m=+0.063976318 container died e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:00:28 compute-0 systemd[1]: e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f-5b476f25fe3c117c.timer: Deactivated successfully.
Nov 25 19:00:28 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.
Nov 25 19:00:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f-userdata-shm.mount: Deactivated successfully.
Nov 25 19:00:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-c833ae51ba237ae50ea3d35037b5146dac66be50f95eae115a60443e88605401-merged.mount: Deactivated successfully.
Nov 25 19:00:28 compute-0 podman[197773]: 2025-11-25 19:00:28.907400221 +0000 UTC m=+0.340824200 container cleanup e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:00:28 compute-0 podman[197773]: podman_exporter
Nov 25 19:00:28 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 19:00:28 compute-0 podman[197800]: podman_exporter
Nov 25 19:00:28 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 25 19:00:28 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 25 19:00:29 compute-0 systemd[1]: Starting podman_exporter container...
Nov 25 19:00:29 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:00:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c833ae51ba237ae50ea3d35037b5146dac66be50f95eae115a60443e88605401/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c833ae51ba237ae50ea3d35037b5146dac66be50f95eae115a60443e88605401/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.
Nov 25 19:00:29 compute-0 podman[197813]: 2025-11-25 19:00:29.196324224 +0000 UTC m=+0.158425484 container init e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.219Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.219Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.220Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.220Z caller=handler.go:105 level=info collector=container
Nov 25 19:00:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:00:29 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 25 19:00:29 compute-0 podman[197585]: time="2025-11-25T19:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:00:29 compute-0 podman[197813]: 2025-11-25 19:00:29.231068224 +0000 UTC m=+0.193169434 container start e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:00:29 compute-0 podman[197813]: podman_exporter
Nov 25 19:00:29 compute-0 systemd[1]: Started podman_exporter container.
Nov 25 19:00:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14052 "" "Go-http-client/1.1"
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.250Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.251Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 25 19:00:29 compute-0 podman_exporter[197829]: ts=2025-11-25T19:00:29.252Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 25 19:00:29 compute-0 sudo[197767]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:29 compute-0 podman[197838]: 2025-11-25 19:00:29.324451093 +0000 UTC m=+0.079297765 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:00:29 compute-0 sudo[198012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfvtdbnlhlrojjlqehexixyozbtxend ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097229.554593-1114-186786733985471/AnsiballZ_stat.py'
Nov 25 19:00:29 compute-0 sudo[198012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:30 compute-0 python3.9[198014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:00:30 compute-0 sudo[198012]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:30 compute-0 sudo[198135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyapuxcfanummuntsxqfhhmukuypzyoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097229.554593-1114-186786733985471/AnsiballZ_copy.py'
Nov 25 19:00:30 compute-0 sudo[198135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:30 compute-0 python3.9[198137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764097229.554593-1114-186786733985471/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 25 19:00:30 compute-0 sudo[198135]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:00:31.053 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:00:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:00:31.055 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:00:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:00:31.055 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:00:31 compute-0 sudo[198288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asslamjqmghkdchgrenzpdwkbfrxqyrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097231.345073-1148-112501632360884/AnsiballZ_container_config_data.py'
Nov 25 19:00:31 compute-0 sudo[198288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:31 compute-0 python3.9[198290]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 25 19:00:31 compute-0 sudo[198288]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:32 compute-0 sudo[198440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rluhxmtamlgkftdseumqkwoacfrzdlbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097232.2542417-1166-167354913892495/AnsiballZ_container_config_hash.py'
Nov 25 19:00:32 compute-0 sudo[198440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:32 compute-0 python3.9[198442]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 25 19:00:32 compute-0 sudo[198440]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:33 compute-0 sudo[198592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aijqftoejzcfopwezfjnvxeufkdohqda ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764097233.219836-1186-26269443789425/AnsiballZ_edpm_container_manage.py'
Nov 25 19:00:33 compute-0 sudo[198592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:33 compute-0 python3[198594]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 25 19:00:36 compute-0 podman[198650]: 2025-11-25 19:00:36.309251352 +0000 UTC m=+0.226941387 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:00:36 compute-0 podman[198606]: 2025-11-25 19:00:36.44980061 +0000 UTC m=+2.499266742 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 25 19:00:36 compute-0 podman[198731]: 2025-11-25 19:00:36.605248466 +0000 UTC m=+0.028711492 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 25 19:00:36 compute-0 podman[198731]: 2025-11-25 19:00:36.716840217 +0000 UTC m=+0.140303243 container create a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, release=1755695350, config_id=edpm, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public)
Nov 25 19:00:36 compute-0 python3[198594]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 25 19:00:36 compute-0 sudo[198592]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:37 compute-0 sudo[198919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggjskxahruyzdtqkiunlqbkkshfuttmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097237.1822534-1202-91328510467440/AnsiballZ_stat.py'
Nov 25 19:00:37 compute-0 sudo[198919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:37 compute-0 python3.9[198921]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 19:00:37 compute-0 sudo[198919]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:38 compute-0 podman[199047]: 2025-11-25 19:00:38.594645763 +0000 UTC m=+0.062083425 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:00:38 compute-0 sudo[199088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyaxguvzenwlmoumstyaoiklulwfopul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097238.2126734-1220-272228916071285/AnsiballZ_file.py'
Nov 25 19:00:38 compute-0 sudo[199088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:38 compute-0 python3.9[199094]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:38 compute-0 sudo[199088]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:39 compute-0 sudo[199243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exblqakdiyohrjpkdduidnvpihcfdxzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097238.890111-1220-59136472520429/AnsiballZ_copy.py'
Nov 25 19:00:39 compute-0 sudo[199243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:39 compute-0 python3.9[199245]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764097238.890111-1220-59136472520429/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:39 compute-0 sudo[199243]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:39 compute-0 sudo[199319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugxsmiiyyqhdpqsdljszchcwwzogrpxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097238.890111-1220-59136472520429/AnsiballZ_systemd.py'
Nov 25 19:00:39 compute-0 sudo[199319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:40 compute-0 python3.9[199321]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 25 19:00:40 compute-0 systemd[1]: Reloading.
Nov 25 19:00:40 compute-0 systemd-rc-local-generator[199350]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:00:40 compute-0 systemd-sysv-generator[199353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:00:40 compute-0 sudo[199319]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:41 compute-0 sudo[199430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbajzvkpvmeefhbasdbdgvfydwhmulhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097238.890111-1220-59136472520429/AnsiballZ_systemd.py'
Nov 25 19:00:41 compute-0 sudo[199430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:41 compute-0 python3.9[199432]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 19:00:41 compute-0 systemd[1]: Reloading.
Nov 25 19:00:41 compute-0 systemd-rc-local-generator[199459]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:00:41 compute-0 systemd-sysv-generator[199465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:00:41 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 25 19:00:41 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:00:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.
Nov 25 19:00:41 compute-0 podman[199472]: 2025-11-25 19:00:41.873802507 +0000 UTC m=+0.154750651 container init a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=)
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *bridge.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *coverage.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *datapath.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *iface.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *memory.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *ovnnorthd.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *ovn.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *ovsdbserver.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *pmd_perf.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *pmd_rxq.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: INFO    19:00:41 main.go:48: registering *vswitch.Collector
Nov 25 19:00:41 compute-0 openstack_network_exporter[199487]: NOTICE  19:00:41 main.go:76: listening on https://:9105/metrics
Nov 25 19:00:41 compute-0 podman[199472]: 2025-11-25 19:00:41.899183584 +0000 UTC m=+0.180131698 container start a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_id=edpm)
Nov 25 19:00:41 compute-0 podman[199472]: openstack_network_exporter
Nov 25 19:00:41 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 25 19:00:41 compute-0 sudo[199430]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:42 compute-0 podman[199498]: 2025-11-25 19:00:42.027179636 +0000 UTC m=+0.109720661 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 25 19:00:42 compute-0 sudo[199670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxvfxynianjyhqtouuqwobzrkcwxekck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097242.4296165-1268-203976243148821/AnsiballZ_systemd.py'
Nov 25 19:00:42 compute-0 sudo[199670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:43 compute-0 python3.9[199672]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 19:00:43 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 25 19:00:43 compute-0 systemd[1]: libpod-a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.scope: Deactivated successfully.
Nov 25 19:00:43 compute-0 podman[199676]: 2025-11-25 19:00:43.238898709 +0000 UTC m=+0.065676594 container died a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:00:43 compute-0 systemd[1]: a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e-137918a18604cc24.timer: Deactivated successfully.
Nov 25 19:00:43 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.
Nov 25 19:00:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e-userdata-shm.mount: Deactivated successfully.
Nov 25 19:00:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0-merged.mount: Deactivated successfully.
Nov 25 19:00:43 compute-0 podman[199676]: 2025-11-25 19:00:43.945353269 +0000 UTC m=+0.772131154 container cleanup a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 19:00:43 compute-0 podman[199676]: openstack_network_exporter
Nov 25 19:00:43 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 25 19:00:44 compute-0 podman[199703]: openstack_network_exporter
Nov 25 19:00:44 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 25 19:00:44 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 25 19:00:44 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 25 19:00:44 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:00:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51df5e6812985d7f74a81280f3771a945033de7b993d8240bc92feb660713b0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 25 19:00:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.
Nov 25 19:00:44 compute-0 podman[199716]: 2025-11-25 19:00:44.220705539 +0000 UTC m=+0.152052852 container init a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, release=1755695350, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *bridge.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *coverage.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *datapath.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *iface.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *memory.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *ovnnorthd.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *ovn.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *ovsdbserver.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *pmd_perf.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *pmd_rxq.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: INFO    19:00:44 main.go:48: registering *vswitch.Collector
Nov 25 19:00:44 compute-0 openstack_network_exporter[199731]: NOTICE  19:00:44 main.go:76: listening on https://:9105/metrics
Nov 25 19:00:44 compute-0 podman[199716]: 2025-11-25 19:00:44.248701254 +0000 UTC m=+0.180048497 container start a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9)
Nov 25 19:00:44 compute-0 podman[199716]: openstack_network_exporter
Nov 25 19:00:44 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 25 19:00:44 compute-0 sudo[199670]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:44 compute-0 podman[199741]: 2025-11-25 19:00:44.359995179 +0000 UTC m=+0.091704506 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 25 19:00:44 compute-0 sudo[199910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knagnwbwadnuhfkkvfywbwknfdwymtxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097244.5547886-1284-211782777063197/AnsiballZ_find.py'
Nov 25 19:00:44 compute-0 sudo[199910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:45 compute-0 python3.9[199912]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 25 19:00:45 compute-0 sudo[199910]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:46 compute-0 sudo[200062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztbdkxwztypknrcapzqknwaivolkzeio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097245.6372764-1303-277165519961897/AnsiballZ_podman_container_info.py'
Nov 25 19:00:46 compute-0 sudo[200062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:46 compute-0 python3.9[200064]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 25 19:00:46 compute-0 sudo[200062]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:47 compute-0 podman[200154]: 2025-11-25 19:00:47.139585427 +0000 UTC m=+0.069999168 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:00:47 compute-0 sudo[200248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqldcfbmedipvqyovsyoktbkufhwtllq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097246.758735-1311-214227308474869/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:47 compute-0 sudo[200248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:47 compute-0 python3.9[200250]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:47 compute-0 systemd[1]: Started libpod-conmon-8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84.scope.
Nov 25 19:00:47 compute-0 podman[200251]: 2025-11-25 19:00:47.772155525 +0000 UTC m=+0.112773958 container exec 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:00:47 compute-0 podman[200251]: 2025-11-25 19:00:47.804202199 +0000 UTC m=+0.144820632 container exec_died 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 25 19:00:47 compute-0 systemd[1]: libpod-conmon-8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84.scope: Deactivated successfully.
Nov 25 19:00:47 compute-0 sudo[200248]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:48 compute-0 sudo[200433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yntngsqctyhpglkilwltfeqcuegqcnqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097248.095214-1319-203367804825518/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:48 compute-0 sudo[200433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:48 compute-0 python3.9[200435]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:48 compute-0 systemd[1]: Started libpod-conmon-8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84.scope.
Nov 25 19:00:48 compute-0 podman[200436]: 2025-11-25 19:00:48.845624559 +0000 UTC m=+0.092663077 container exec 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Nov 25 19:00:48 compute-0 podman[200436]: 2025-11-25 19:00:48.879906262 +0000 UTC m=+0.126944790 container exec_died 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Nov 25 19:00:48 compute-0 systemd[1]: libpod-conmon-8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84.scope: Deactivated successfully.
Nov 25 19:00:48 compute-0 sudo[200433]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:49 compute-0 sudo[200616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxywmdfgknbkrpryckivroutscsvgwqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097249.1462448-1327-149657819326208/AnsiballZ_file.py'
Nov 25 19:00:49 compute-0 sudo[200616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:50 compute-0 python3.9[200618]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:50 compute-0 sudo[200616]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:50 compute-0 sudo[200768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiahhnfsogqbqjfbjssllioiaruowxkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097250.4817991-1336-215137965913690/AnsiballZ_podman_container_info.py'
Nov 25 19:00:50 compute-0 sudo[200768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:51 compute-0 python3.9[200770]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 25 19:00:51 compute-0 sudo[200768]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:51 compute-0 sudo[200933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsnlrsjyovlzfhtezsbckhmceeoucosg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097251.4197638-1344-63450393388619/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:51 compute-0 sudo[200933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:52 compute-0 python3.9[200935]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:52 compute-0 systemd[1]: Started libpod-conmon-954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec.scope.
Nov 25 19:00:52 compute-0 podman[200936]: 2025-11-25 19:00:52.139384494 +0000 UTC m=+0.083520116 container exec 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 25 19:00:52 compute-0 podman[200936]: 2025-11-25 19:00:52.17015646 +0000 UTC m=+0.114292092 container exec_died 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:00:52 compute-0 systemd[1]: libpod-conmon-954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec.scope: Deactivated successfully.
Nov 25 19:00:52 compute-0 sudo[200933]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:52 compute-0 sudo[201121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gydefpgwqplmmhahqcebheoznmmhwtof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097252.4281244-1352-136458644553466/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:52 compute-0 sudo[201121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:53 compute-0 python3.9[201123]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:53 compute-0 systemd[1]: Started libpod-conmon-954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec.scope.
Nov 25 19:00:53 compute-0 podman[201124]: 2025-11-25 19:00:53.211642053 +0000 UTC m=+0.098715890 container exec 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 25 19:00:53 compute-0 podman[201124]: 2025-11-25 19:00:53.246012008 +0000 UTC m=+0.133085825 container exec_died 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 19:00:53 compute-0 systemd[1]: libpod-conmon-954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec.scope: Deactivated successfully.
Nov 25 19:00:53 compute-0 sudo[201121]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:53 compute-0 sudo[201305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaejxfuqejbffqetfbnclluhbvxjqtya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097253.4956675-1360-112682162531013/AnsiballZ_file.py'
Nov 25 19:00:53 compute-0 sudo[201305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:54 compute-0 python3.9[201307]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:54 compute-0 sudo[201305]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:54 compute-0 sudo[201457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctlosgojidcwufwjjyoimlevcmobmbfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097254.3771744-1369-17268925111643/AnsiballZ_podman_container_info.py'
Nov 25 19:00:54 compute-0 sudo[201457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:54 compute-0 python3.9[201459]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 25 19:00:55 compute-0 sudo[201457]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:55 compute-0 sudo[201623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rurzdzhqqptdlwddaigwulrdaeawjtps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097255.308853-1377-13454892331527/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:55 compute-0 sudo[201623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:55 compute-0 python3.9[201625]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:55 compute-0 systemd[1]: Started libpod-conmon-1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.scope.
Nov 25 19:00:56 compute-0 podman[201626]: 2025-11-25 19:00:56.013717047 +0000 UTC m=+0.090949319 container exec 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 25 19:00:56 compute-0 podman[201626]: 2025-11-25 19:00:56.045324421 +0000 UTC m=+0.122556633 container exec_died 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 19:00:56 compute-0 systemd[1]: libpod-conmon-1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.scope: Deactivated successfully.
Nov 25 19:00:56 compute-0 sudo[201623]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:56 compute-0 sudo[201807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhmzlbwmhkymwsljobdmyjlctgvcwnry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097256.3498478-1385-123826956490180/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:56 compute-0 sudo[201807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:56 compute-0 python3.9[201809]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:57 compute-0 systemd[1]: Started libpod-conmon-1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.scope.
Nov 25 19:00:57 compute-0 podman[201810]: 2025-11-25 19:00:57.04345161 +0000 UTC m=+0.099137119 container exec 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:00:57 compute-0 podman[201810]: 2025-11-25 19:00:57.076920446 +0000 UTC m=+0.132605955 container exec_died 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:00:57 compute-0 systemd[1]: libpod-conmon-1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562.scope: Deactivated successfully.
Nov 25 19:00:57 compute-0 sudo[201807]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:57 compute-0 sudo[201992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbuawyxmxnhaqatdmmujjxesockcvimz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097257.359444-1393-66481780447944/AnsiballZ_file.py'
Nov 25 19:00:57 compute-0 sudo[201992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:57 compute-0 python3.9[201994]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:00:58 compute-0 sudo[201992]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:58 compute-0 sudo[202144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxownxvocdrhredwdajzqiqynbdpbcke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097258.2977734-1402-138570894567567/AnsiballZ_podman_container_info.py'
Nov 25 19:00:58 compute-0 sudo[202144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:58 compute-0 python3.9[202146]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 25 19:00:58 compute-0 sudo[202144]: pam_unix(sudo:session): session closed for user root
Nov 25 19:00:59 compute-0 auditd[704]: Audit daemon rotating log files
Nov 25 19:00:59 compute-0 sudo[202322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phiyyiweonakgssgotnrjwfwedwhfoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097259.2249525-1410-112219432858066/AnsiballZ_podman_container_exec.py'
Nov 25 19:00:59 compute-0 sudo[202322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:00:59 compute-0 podman[202283]: 2025-11-25 19:00:59.684135267 +0000 UTC m=+0.087228108 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:00:59 compute-0 python3.9[202330]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:00:59 compute-0 systemd[1]: Started libpod-conmon-e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.scope.
Nov 25 19:01:00 compute-0 podman[202336]: 2025-11-25 19:01:00.014823602 +0000 UTC m=+0.101914560 container exec e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:01:00 compute-0 podman[202336]: 2025-11-25 19:01:00.046023168 +0000 UTC m=+0.133114056 container exec_died e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:01:00 compute-0 systemd[1]: libpod-conmon-e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.scope: Deactivated successfully.
Nov 25 19:01:00 compute-0 sudo[202322]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:00 compute-0 sudo[202518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snkmqygaixoadzczhtazwaxmsbsztvba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097260.3468153-1418-81747244676206/AnsiballZ_podman_container_exec.py'
Nov 25 19:01:00 compute-0 sudo[202518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:01 compute-0 CROND[202522]: (root) CMD (run-parts /etc/cron.hourly)
Nov 25 19:01:01 compute-0 run-parts[202525]: (/etc/cron.hourly) starting 0anacron
Nov 25 19:01:01 compute-0 anacron[202533]: Anacron started on 2025-11-25
Nov 25 19:01:01 compute-0 python3.9[202520]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:01:01 compute-0 anacron[202533]: Will run job `cron.daily' in 35 min.
Nov 25 19:01:01 compute-0 anacron[202533]: Will run job `cron.weekly' in 55 min.
Nov 25 19:01:01 compute-0 anacron[202533]: Will run job `cron.monthly' in 75 min.
Nov 25 19:01:01 compute-0 anacron[202533]: Jobs will be executed sequentially
Nov 25 19:01:01 compute-0 run-parts[202536]: (/etc/cron.hourly) finished 0anacron
Nov 25 19:01:01 compute-0 CROND[202521]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 25 19:01:01 compute-0 systemd[1]: Started libpod-conmon-e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.scope.
Nov 25 19:01:01 compute-0 podman[202535]: 2025-11-25 19:01:01.327981183 +0000 UTC m=+0.111824018 container exec e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:01:01 compute-0 podman[202535]: 2025-11-25 19:01:01.364063546 +0000 UTC m=+0.147906331 container exec_died e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:01:01 compute-0 systemd[1]: libpod-conmon-e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f.scope: Deactivated successfully.
Nov 25 19:01:01 compute-0 sudo[202518]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:02 compute-0 sudo[202715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eckntfwrxzfpnprtsyuybikglxwcznwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097261.668329-1426-56988709613267/AnsiballZ_file.py'
Nov 25 19:01:02 compute-0 sudo[202715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:02 compute-0 python3.9[202717]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:02 compute-0 sudo[202715]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:03 compute-0 sudo[202867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqthlbcblnnjxjcbiroufmmnvhshjfyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097262.5949504-1435-26894452944720/AnsiballZ_podman_container_info.py'
Nov 25 19:01:03 compute-0 sudo[202867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:03 compute-0 python3.9[202869]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 25 19:01:03 compute-0 sudo[202867]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:03 compute-0 sudo[203032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwkapynilbfyxktdormhxewudqccltol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097263.6106331-1443-108968956635132/AnsiballZ_podman_container_exec.py'
Nov 25 19:01:03 compute-0 sudo[203032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:04 compute-0 python3.9[203034]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:01:04 compute-0 systemd[1]: Started libpod-conmon-a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.scope.
Nov 25 19:01:04 compute-0 podman[203035]: 2025-11-25 19:01:04.33100641 +0000 UTC m=+0.090787715 container exec a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Nov 25 19:01:04 compute-0 podman[203035]: 2025-11-25 19:01:04.33829698 +0000 UTC m=+0.098078215 container exec_died a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 25 19:01:04 compute-0 sudo[203032]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:04 compute-0 systemd[1]: libpod-conmon-a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.scope: Deactivated successfully.
Nov 25 19:01:05 compute-0 sudo[203215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crsibppcblpiaxkdfklbnaquhhejwjye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097264.625074-1451-13754661203209/AnsiballZ_podman_container_exec.py'
Nov 25 19:01:05 compute-0 sudo[203215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:05 compute-0 python3.9[203217]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 25 19:01:05 compute-0 systemd[1]: Started libpod-conmon-a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.scope.
Nov 25 19:01:05 compute-0 podman[203218]: 2025-11-25 19:01:05.378352221 +0000 UTC m=+0.104496917 container exec a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 19:01:05 compute-0 podman[203218]: 2025-11-25 19:01:05.408690407 +0000 UTC m=+0.134835063 container exec_died a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Nov 25 19:01:05 compute-0 systemd[1]: libpod-conmon-a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e.scope: Deactivated successfully.
Nov 25 19:01:05 compute-0 sudo[203215]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:06 compute-0 sudo[203399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jifgdzygufvdansolldhjgufiduncatq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097265.7011604-1459-184661360049801/AnsiballZ_file.py'
Nov 25 19:01:06 compute-0 sudo[203399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:06 compute-0 python3.9[203401]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:06 compute-0 sudo[203399]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:07 compute-0 podman[203426]: 2025-11-25 19:01:07.214705046 +0000 UTC m=+0.128564086 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:01:09 compute-0 podman[203452]: 2025-11-25 19:01:09.190082466 +0000 UTC m=+0.113923554 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125)
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.021 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.022 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.541 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.541 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.541 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.542 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.542 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.542 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.542 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:01:11 compute-0 nova_compute[187212]: 2025-11-25 19:01:11.542 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.058 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.059 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.059 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.060 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.280 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.282 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.309 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.310 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6077MB free_disk=73.02980041503906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.311 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:01:12 compute-0 nova_compute[187212]: 2025-11-25 19:01:12.311 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:01:13 compute-0 nova_compute[187212]: 2025-11-25 19:01:13.366 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:01:13 compute-0 nova_compute[187212]: 2025-11-25 19:01:13.367 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:01:12 up 53 min,  0 user,  load average: 0.70, 0.82, 0.68\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:01:13 compute-0 nova_compute[187212]: 2025-11-25 19:01:13.443 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:01:13 compute-0 nova_compute[187212]: 2025-11-25 19:01:13.959 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:01:14 compute-0 nova_compute[187212]: 2025-11-25 19:01:14.469 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:01:14 compute-0 nova_compute[187212]: 2025-11-25 19:01:14.469 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:01:15 compute-0 podman[203473]: 2025-11-25 19:01:15.175795324 +0000 UTC m=+0.092842901 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350)
Nov 25 19:01:18 compute-0 podman[203494]: 2025-11-25 19:01:18.171240435 +0000 UTC m=+0.090976070 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:01:26 compute-0 sudo[203639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhuepecwgbxcysjxcxnzxefdfuqleyrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097286.3272986-1634-167586078601846/AnsiballZ_file.py'
Nov 25 19:01:26 compute-0 sudo[203639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:26 compute-0 python3.9[203641]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:26 compute-0 sudo[203639]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:27 compute-0 sudo[203791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibbbyooegflzqooekroxmuykmbhdlllu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097287.2352989-1650-123763933739201/AnsiballZ_stat.py'
Nov 25 19:01:27 compute-0 sudo[203791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:27 compute-0 python3.9[203793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:27 compute-0 sudo[203791]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:28 compute-0 sudo[203914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sougvpynsibnkimfmihsdjusskiiataq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097287.2352989-1650-123763933739201/AnsiballZ_copy.py'
Nov 25 19:01:28 compute-0 sudo[203914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:28 compute-0 python3.9[203916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097287.2352989-1650-123763933739201/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:28 compute-0 sudo[203914]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:29 compute-0 sudo[204066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzukijvszhoforyyzaotppmhoyimleiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097288.8525841-1682-197201312888293/AnsiballZ_file.py'
Nov 25 19:01:29 compute-0 sudo[204066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:29 compute-0 python3.9[204068]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:29 compute-0 sudo[204066]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:30 compute-0 sudo[204231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtjtszxnrbgjgztewmhgoejyatuvkciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097289.728203-1698-93601411339435/AnsiballZ_stat.py'
Nov 25 19:01:30 compute-0 sudo[204231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:30 compute-0 podman[204192]: 2025-11-25 19:01:30.14947667 +0000 UTC m=+0.109654730 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:01:30 compute-0 python3.9[204244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:30 compute-0 sudo[204231]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:30 compute-0 sudo[204320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seqrnzvfpmclyzritsxcjymnsavtjzds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097289.728203-1698-93601411339435/AnsiballZ_file.py'
Nov 25 19:01:30 compute-0 sudo[204320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:30 compute-0 python3.9[204322]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:30 compute-0 sudo[204320]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:01:31.056 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:01:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:01:31.056 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:01:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:01:31.056 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:01:31 compute-0 sudo[204473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urstwewyyhqrwnngezvmzkuozhfnilwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097291.0906413-1722-153054720743106/AnsiballZ_stat.py'
Nov 25 19:01:31 compute-0 sudo[204473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:31 compute-0 python3.9[204475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:31 compute-0 sudo[204473]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:32 compute-0 sudo[204551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnkitaaykpfrgggjcwacqdsqnwtzmpnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097291.0906413-1722-153054720743106/AnsiballZ_file.py'
Nov 25 19:01:32 compute-0 sudo[204551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:32 compute-0 python3.9[204553]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ze53p1tc recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:32 compute-0 sudo[204551]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:32 compute-0 sudo[204703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpacfsyjukjjusjdcbzbqziwajafccwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097292.4755745-1746-261357690113218/AnsiballZ_stat.py'
Nov 25 19:01:32 compute-0 sudo[204703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:33 compute-0 python3.9[204705]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:33 compute-0 sudo[204703]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:33 compute-0 sudo[204781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvzhgsxobcbhffvgqefqttzkbnpwhfxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097292.4755745-1746-261357690113218/AnsiballZ_file.py'
Nov 25 19:01:33 compute-0 sudo[204781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:33 compute-0 python3.9[204783]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:33 compute-0 sudo[204781]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:34 compute-0 sudo[204933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcszdggrbyoyslnnjgyqwoeaslgkzfza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097293.9004316-1772-179070985206067/AnsiballZ_command.py'
Nov 25 19:01:34 compute-0 sudo[204933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:34 compute-0 python3.9[204935]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 19:01:34 compute-0 sudo[204933]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:35 compute-0 sudo[205087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrznlkrvttizryzcubtkulhiuipwjtkg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764097294.7308724-1788-1083001924645/AnsiballZ_edpm_nftables_from_files.py'
Nov 25 19:01:35 compute-0 sudo[205087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:35 compute-0 python3[205089]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 25 19:01:35 compute-0 sudo[205087]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:36 compute-0 sudo[205239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyulsccdefnwofeukhhedwjxyotgivkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097295.7420282-1804-229854622286077/AnsiballZ_stat.py'
Nov 25 19:01:36 compute-0 sudo[205239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:36 compute-0 python3.9[205241]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:36 compute-0 sudo[205239]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:36 compute-0 sudo[205317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upkhgdrecaxnhmeapvemqxivsyupegjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097295.7420282-1804-229854622286077/AnsiballZ_file.py'
Nov 25 19:01:36 compute-0 sudo[205317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:36 compute-0 python3.9[205319]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:36 compute-0 sudo[205317]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:37 compute-0 sudo[205485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqoecefypitfthsegdecfviqzhqxqzrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097297.195398-1828-12780770133195/AnsiballZ_stat.py'
Nov 25 19:01:37 compute-0 sudo[205485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:37 compute-0 podman[205443]: 2025-11-25 19:01:37.71334167 +0000 UTC m=+0.096557061 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 19:01:37 compute-0 python3.9[205491]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:37 compute-0 sudo[205485]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:38 compute-0 sudo[205573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfljyqqnjoxylifvdxbbyoziqfnflyen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097297.195398-1828-12780770133195/AnsiballZ_file.py'
Nov 25 19:01:38 compute-0 sudo[205573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:38 compute-0 python3.9[205575]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:38 compute-0 sudo[205573]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:39 compute-0 sudo[205725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgdevlcfxwbygyeqhnmbdiobbxiqlapd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097298.6342633-1852-1589587723634/AnsiballZ_stat.py'
Nov 25 19:01:39 compute-0 sudo[205725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:39 compute-0 python3.9[205727]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:39 compute-0 sudo[205725]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:39 compute-0 podman[205777]: 2025-11-25 19:01:39.564009399 +0000 UTC m=+0.061344341 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 25 19:01:39 compute-0 sudo[205823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssbeqdpytunznstyhayxpckhnkzkcopv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097298.6342633-1852-1589587723634/AnsiballZ_file.py'
Nov 25 19:01:39 compute-0 sudo[205823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:39 compute-0 python3.9[205825]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:39 compute-0 sudo[205823]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:40 compute-0 sudo[205975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrfhuljiaydlcavclxqxucjstvyntihk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097300.0258145-1876-34619981214651/AnsiballZ_stat.py'
Nov 25 19:01:40 compute-0 sudo[205975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:40 compute-0 python3.9[205977]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:40 compute-0 sudo[205975]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:41 compute-0 sudo[206053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eevccprcjauinfuzimbhbvifcglvjfic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097300.0258145-1876-34619981214651/AnsiballZ_file.py'
Nov 25 19:01:41 compute-0 sudo[206053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:41 compute-0 python3.9[206055]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:41 compute-0 sudo[206053]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:41 compute-0 sudo[206205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvhnmwoejeepgicynmylyxlhdsbulwsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097301.4910984-1900-115847506440032/AnsiballZ_stat.py'
Nov 25 19:01:41 compute-0 sudo[206205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:42 compute-0 python3.9[206207]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 25 19:01:42 compute-0 sudo[206205]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:42 compute-0 sudo[206330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erzycpsvqegjgbuzonskxvupogpivmaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097301.4910984-1900-115847506440032/AnsiballZ_copy.py'
Nov 25 19:01:42 compute-0 sudo[206330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:42 compute-0 python3.9[206332]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764097301.4910984-1900-115847506440032/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:42 compute-0 sudo[206330]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:43 compute-0 sudo[206482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxddhohykeqvkgpuvfzbpozpifrxteji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097303.1144502-1930-74962562617145/AnsiballZ_file.py'
Nov 25 19:01:43 compute-0 sudo[206482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:43 compute-0 python3.9[206484]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:43 compute-0 sudo[206482]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:44 compute-0 sudo[206634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nismmgtwrdqldoykhrinorfatxxukjdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097303.9809847-1946-162578231941449/AnsiballZ_command.py'
Nov 25 19:01:44 compute-0 sudo[206634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:44 compute-0 python3.9[206636]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 19:01:44 compute-0 sudo[206634]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:45 compute-0 sudo[206802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caftjhibtayjntgfgwbeynblkajijido ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097304.813274-1962-249792960009359/AnsiballZ_blockinfile.py'
Nov 25 19:01:45 compute-0 sudo[206802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:45 compute-0 podman[206763]: 2025-11-25 19:01:45.297183972 +0000 UTC m=+0.066627461 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 19:01:45 compute-0 python3.9[206813]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:45 compute-0 sudo[206802]: pam_unix(sudo:session): session closed for user root
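[Editor's note] The blockinfile invocation logged at 19:01:45 inserts a marker-delimited include block into /etc/sysconfig/nftables.conf, validated with `nft -c -f %s` before the file is moved into place. A sketch (not the role's actual code) reconstructing that block from the logged `block=` and `marker=` arguments, written to a temp file so it is safe to run anywhere:

```shell
# Sketch: reproduce the block that ansible.builtin.blockinfile manages in
# /etc/sysconfig/nftables.conf, per the module arguments in the log above.
conf=$(mktemp)
{
  echo '# BEGIN ANSIBLE MANAGED BLOCK'
  echo 'include "/etc/nftables/iptables.nft"'
  echo 'include "/etc/nftables/edpm-chains.nft"'
  echo 'include "/etc/nftables/edpm-rules.nft"'
  echo 'include "/etc/nftables/edpm-jumps.nft"'
  echo '# END ANSIBLE MANAGED BLOCK'
} >> "$conf"
cat "$conf"
```

Note the include order mirrors load order: chains must exist before the rules and jumps that reference them.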
Nov 25 19:01:46 compute-0 sudo[206963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eguqklipqthyhenepykkemguhlvdbroe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097306.0337417-1980-130409096186559/AnsiballZ_command.py'
Nov 25 19:01:46 compute-0 sudo[206963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:46 compute-0 python3.9[206965]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 19:01:46 compute-0 sudo[206963]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:47 compute-0 sudo[207116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpldddxvcnriducouuytmvrisjqtgscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097306.9788516-1996-108271786149860/AnsiballZ_stat.py'
Nov 25 19:01:47 compute-0 sudo[207116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:47 compute-0 python3.9[207118]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 25 19:01:47 compute-0 sudo[207116]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:48 compute-0 sudo[207280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjsaelkjoxwsyvgdezktwikozkuwvggg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097307.879504-2012-19358497230454/AnsiballZ_command.py'
Nov 25 19:01:48 compute-0 sudo[207280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:48 compute-0 podman[207244]: 2025-11-25 19:01:48.315619853 +0000 UTC m=+0.089255732 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 19:01:48 compute-0 python3.9[207289]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 19:01:48 compute-0 sudo[207280]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:49 compute-0 sudo[207444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwkbdinoaiqdjdkcztwjbfsnffwxtjyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764097308.7798476-2028-195400162349193/AnsiballZ_file.py'
Nov 25 19:01:49 compute-0 sudo[207444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:49 compute-0 python3.9[207446]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:49 compute-0 sudo[207444]: pam_unix(sudo:session): session closed for user root
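[Editor's note] The tasks logged between 19:01:44 and 19:01:49 follow a check-then-apply pattern: the full candidate ruleset is syntax-checked with `nft -c -f -`, the chains file is loaded, then flushes, rules, and jump updates are loaded from a single concatenated stream, and finally the `.changed` flag file is removed. A dry-run sketch of that sequence (the `nft` calls are stubbed with `echo`, so nothing is loaded into the kernel and the sketch runs without nftables installed):

```shell
# Dry-run sketch of the logged apply sequence; NFT is a stub so the
# commands are printed rather than executed.
NFT="echo nft"
d=/etc/nftables

# 1. syntax-check the complete candidate ruleset (chains first)
echo "cat $d/edpm-chains.nft $d/edpm-flushes.nft $d/edpm-rules.nft $d/edpm-update-jumps.nft $d/edpm-jumps.nft | nft -c -f -"
# 2. (re)create the chains
$NFT -f "$d/edpm-chains.nft"
# 3. flush and reload rules and jump updates from one stream, so nft
#    reads the reload as a single file
echo "cat $d/edpm-flushes.nft $d/edpm-rules.nft $d/edpm-update-jumps.nft | nft -f -"
# 4. clear the change marker once the reload succeeded
echo "rm -f $d/edpm-rules.nft.changed"
```

The marker file (created at 19:01:43, stat'ed at 19:01:47, removed at 19:01:49) is what makes the reload conditional: the live ruleset is only touched when the rules file actually changed.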
Nov 25 19:01:49 compute-0 sshd-session[187526]: Connection closed by 192.168.122.30 port 41762
Nov 25 19:01:49 compute-0 sshd-session[187523]: pam_unix(sshd:session): session closed for user zuul
Nov 25 19:01:49 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 25 19:01:49 compute-0 systemd[1]: session-26.scope: Consumed 1min 41.340s CPU time.
Nov 25 19:01:49 compute-0 systemd-logind[820]: Session 26 logged out. Waiting for processes to exit.
Nov 25 19:01:49 compute-0 systemd-logind[820]: Removed session 26.
Nov 25 19:01:55 compute-0 sshd-session[207471]: Accepted publickey for zuul from 38.102.83.130 port 59074 ssh2: RSA SHA256:jYBcT1icRquFQigknn/K3KSao1vKrqzJ1yq0uAHq9V0
Nov 25 19:01:55 compute-0 systemd-logind[820]: New session 27 of user zuul.
Nov 25 19:01:55 compute-0 systemd[1]: Started Session 27 of User zuul.
Nov 25 19:01:55 compute-0 sshd-session[207471]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 19:01:55 compute-0 sudo[207498]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uebkaglskmixakiqdzsnouububkjpjtf ; /usr/bin/python3'
Nov 25 19:01:55 compute-0 sudo[207498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:56 compute-0 python3[207500]: ansible-ansible.legacy.dnf Invoked with name=['nfs-utils', 'iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Nov 25 19:01:57 compute-0 sudo[207498]: pam_unix(sudo:session): session closed for user root
Nov 25 19:01:57 compute-0 sudo[207525]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdxkxuowynjzrlmqkuzneagivozpisdi ; /usr/bin/python3'
Nov 25 19:01:57 compute-0 sudo[207525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:57 compute-0 python3[207527]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=vers3 value=n backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:01:57 compute-0 sudo[207525]: pam_unix(sudo:session): session closed for user root
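[Editor's note] The ini_file task logged at 19:01:57 disables NFSv3 by ensuring `vers3=n` under the `[nfsd]` section of /etc/nfs.conf. A sketch of just that resulting stanza, written to a temp file (the real module edits the existing file in place, keeps other sections, and takes a backup per `backup=True`):

```shell
# Sketch: the [nfsd] stanza that community.general.ini_file ensures in
# /etc/nfs.conf per the logged task arguments. Temp file keeps this runnable.
nfsconf=$(mktemp)
cat > "$nfsconf" <<'EOF'
[nfsd]
vers3=n
EOF
cat "$nfsconf"
```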
Nov 25 19:01:58 compute-0 sudo[207553]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gllidxbfpnkgtexmyfrqgbdlgypcpqot ; /usr/bin/python3'
Nov 25 19:01:58 compute-0 sudo[207553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:01:58 compute-0 python3[207555]: ansible-ansible.builtin.systemd_service Invoked with name=rpc-statd.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Nov 25 19:01:59 compute-0 systemd[1]: Reloading.
Nov 25 19:01:59 compute-0 podman[197585]: time="2025-11-25T19:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:01:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:01:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2563 "" "Go-http-client/1.1"
Nov 25 19:01:59 compute-0 systemd-rc-local-generator[207588]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:01:59 compute-0 systemd-sysv-generator[207591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:02:00 compute-0 sudo[207553]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:00 compute-0 sudo[207618]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezfzaeueyqwcacktdupzjfwrqshsfoyj ; /usr/bin/python3'
Nov 25 19:02:00 compute-0 sudo[207618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:00 compute-0 podman[207620]: 2025-11-25 19:02:00.369549709 +0000 UTC m=+0.102593137 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:02:00 compute-0 python3[207621]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Nov 25 19:02:00 compute-0 systemd[1]: Reloading.
Nov 25 19:02:00 compute-0 systemd-rc-local-generator[207670]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:02:00 compute-0 systemd-sysv-generator[207676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:02:00 compute-0 systemd[1]: rpcbind.service: Current command vanished from the unit file, execution of the command list won't be resumed.
Nov 25 19:02:00 compute-0 sudo[207618]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:01 compute-0 sudo[207706]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmrvzwvgabhwfhmbhirpnbekvcdorhys ; /usr/bin/python3'
Nov 25 19:02:01 compute-0 sudo[207706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:01 compute-0 python3[207708]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.socket masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Nov 25 19:02:01 compute-0 systemd[1]: Reloading.
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: ERROR   19:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: ERROR   19:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: ERROR   19:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: ERROR   19:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:02:01 compute-0 openstack_network_exporter[199731]: ERROR   19:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:02:01 compute-0 systemd-rc-local-generator[207741]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:02:01 compute-0 systemd-sysv-generator[207744]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:02:01 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Nov 25 19:02:01 compute-0 sudo[207706]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:01 compute-0 sudo[207772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilutpwmmmvhsizhpttpqyjvfmfpbzwnd ; /usr/bin/python3'
Nov 25 19:02:01 compute-0 sudo[207772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:02 compute-0 python3[207774]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_1 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:02 compute-0 sudo[207772]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:02 compute-0 sudo[207798]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkjarfpkknfdktnmbblipywzqqiyfauj ; /usr/bin/python3'
Nov 25 19:02:02 compute-0 sudo[207798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:02 compute-0 python3[207800]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_2 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:02 compute-0 sudo[207798]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:02 compute-0 sudo[207824]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xggedikzznpcclztnitczwrjwpcbszlj ; /usr/bin/python3'
Nov 25 19:02:02 compute-0 sudo[207824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:02 compute-0 python3[207826]: ansible-ansible.builtin.file Invoked with path=/data/cinderbackup state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:02 compute-0 sudo[207824]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:04 compute-0 sudo[207904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyjhouhmzfqidzkvtotyxkmbjekfdldc ; /usr/bin/python3'
Nov 25 19:02:04 compute-0 sudo[207904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:04 compute-0 python3[207906]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/nfs-server.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 25 19:02:04 compute-0 sudo[207904]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:05 compute-0 sudo[207977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnhgftzbwsqjfkbasbtectuqckmeudxu ; /usr/bin/python3'
Nov 25 19:02:05 compute-0 sudo[207977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:05 compute-0 python3[207979]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/nfs-server.nft mode=0666 src=/home/zuul/.ansible/tmp/ansible-tmp-1764097324.388927-36675-153047161146967/source _original_basename=tmp7lq9fulr follow=False checksum=f91e6a2e98f3d3c48705976f5b33f9e81e7cf7f4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:05 compute-0 sudo[207977]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:05 compute-0 sudo[208027]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njviqevnpgjzypwrefrjfmbkjdsrwdux ; /usr/bin/python3'
Nov 25 19:02:05 compute-0 sudo[208027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:05 compute-0 python3[208029]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/sysconfig/nftables.conf line=include "/etc/nftables/nfs-server.nft" insertafter=EOF state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:05 compute-0 sudo[208027]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:06 compute-0 sudo[208053]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqhcrcdvfpthhddzyljcirefisgjfcmz ; /usr/bin/python3'
Nov 25 19:02:06 compute-0 sudo[208053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:06 compute-0 python3[208055]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 25 19:02:06 compute-0 systemd[1]: Stopping Netfilter Tables...
Nov 25 19:02:06 compute-0 systemd[1]: nftables.service: Deactivated successfully.
Nov 25 19:02:06 compute-0 systemd[1]: Stopped Netfilter Tables.
Nov 25 19:02:06 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 25 19:02:06 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 25 19:02:06 compute-0 sudo[208053]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:06 compute-0 sudo[208084]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvzumhwriinxxkkilempojqbditqqsps ; /usr/bin/python3'
Nov 25 19:02:06 compute-0 sudo[208084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:07 compute-0 python3[208086]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=host value=172.18.0.101 backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:07 compute-0 sudo[208084]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:07 compute-0 sudo[208112]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgixcjsewnclalkamovyqomxftxlltxw ; /usr/bin/python3'
Nov 25 19:02:07 compute-0 sudo[208112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:07 compute-0 python3[208114]: ansible-ansible.builtin.systemd Invoked with name=nfs-server state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 25 19:02:07 compute-0 systemd[1]: Reloading.
Nov 25 19:02:07 compute-0 systemd-rc-local-generator[208143]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 25 19:02:07 compute-0 systemd-sysv-generator[208146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 25 19:02:07 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Nov 25 19:02:07 compute-0 systemd[1]: Mounting NFSD configuration filesystem...
Nov 25 19:02:07 compute-0 systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 19:02:07 compute-0 systemd[1]: Starting NFSv4 ID-name mapping service...
Nov 25 19:02:07 compute-0 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 25 19:02:07 compute-0 rpc.idmapd[208161]: Setting log level to 0
Nov 25 19:02:07 compute-0 systemd[1]: Started NFSv4 ID-name mapping service.
Nov 25 19:02:07 compute-0 podman[208152]: 2025-11-25 19:02:07.99274046 +0000 UTC m=+0.125759113 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 19:02:08 compute-0 systemd[1]: Mounted NFSD configuration filesystem.
Nov 25 19:02:08 compute-0 systemd[1]: Starting NFS Mount Daemon...
Nov 25 19:02:08 compute-0 systemd[1]: Starting NFSv4 Client Tracking Daemon...
Nov 25 19:02:08 compute-0 systemd[1]: Started NFSv4 Client Tracking Daemon.
Nov 25 19:02:08 compute-0 rpc.mountd[208188]: Version 2.5.4 starting
Nov 25 19:02:08 compute-0 systemd[1]: Started NFS Mount Daemon.
Nov 25 19:02:08 compute-0 systemd[1]: Starting NFS server and services...
Nov 25 19:02:08 compute-0 kernel: RPC: Registered rdma transport module.
Nov 25 19:02:08 compute-0 kernel: RPC: Registered rdma backchannel transport module.
Nov 25 19:02:08 compute-0 kernel: NFSD: Using nfsdcld client tracking operations.
Nov 25 19:02:08 compute-0 kernel: NFSD: no clients to reclaim, skipping NFSv4 grace period (net f0000000)
Nov 25 19:02:08 compute-0 systemd[1]: Reloading GSSAPI Proxy Daemon...
Nov 25 19:02:08 compute-0 systemd[1]: Reloaded GSSAPI Proxy Daemon.
Nov 25 19:02:08 compute-0 systemd[1]: Finished NFS server and services.
Nov 25 19:02:08 compute-0 sudo[208112]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:08 compute-0 sudo[208229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsfgftazdiixjushkzkpnkoenememhgh ; /usr/bin/python3'
Nov 25 19:02:08 compute-0 sudo[208229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:08 compute-0 python3[208231]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_1 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:08 compute-0 sudo[208229]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:09 compute-0 sudo[208255]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecztltfuxrmihkcrkczzbsuovprtvkhw ; /usr/bin/python3'
Nov 25 19:02:09 compute-0 sudo[208255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:09 compute-0 python3[208257]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_2 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:09 compute-0 sudo[208255]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:09 compute-0 sudo[208281]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwlmrvoidwxlhrosalkzlknoqopnupef ; /usr/bin/python3'
Nov 25 19:02:09 compute-0 sudo[208281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:09 compute-0 python3[208283]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinderbackup 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 25 19:02:09 compute-0 sudo[208281]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:09 compute-0 sudo[208307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rozeasccawbgnavpbbpywldjqtxgxwmo ; /usr/bin/python3'
Nov 25 19:02:09 compute-0 sudo[208307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 19:02:09 compute-0 podman[208309]: 2025-11-25 19:02:09.818005328 +0000 UTC m=+0.099458253 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:02:09 compute-0 python3[208310]: ansible-ansible.legacy.command Invoked with _raw_params=exportfs -a _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 25 19:02:09 compute-0 sudo[208307]: pam_unix(sudo:session): session closed for user root
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.471 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.472 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.472 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.472 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.472 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.473 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.473 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.473 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.473 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.986 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.987 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.987 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:02:14 compute-0 nova_compute[187212]: 2025-11-25 19:02:14.988 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:02:15 compute-0 nova_compute[187212]: 2025-11-25 19:02:15.214 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:02:15 compute-0 nova_compute[187212]: 2025-11-25 19:02:15.215 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:02:15 compute-0 nova_compute[187212]: 2025-11-25 19:02:15.250 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:02:15 compute-0 nova_compute[187212]: 2025-11-25 19:02:15.251 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6076MB free_disk=73.02734756469727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:02:15 compute-0 nova_compute[187212]: 2025-11-25 19:02:15.251 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:02:15 compute-0 nova_compute[187212]: 2025-11-25 19:02:15.252 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:02:16 compute-0 podman[208331]: 2025-11-25 19:02:16.166204011 +0000 UTC m=+0.084509797 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:02:16 compute-0 nova_compute[187212]: 2025-11-25 19:02:16.322 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:02:16 compute-0 nova_compute[187212]: 2025-11-25 19:02:16.323 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:02:15 up 54 min,  0 user,  load average: 0.68, 0.79, 0.68\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:02:16 compute-0 nova_compute[187212]: 2025-11-25 19:02:16.359 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:02:16 compute-0 nova_compute[187212]: 2025-11-25 19:02:16.868 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:02:17 compute-0 nova_compute[187212]: 2025-11-25 19:02:17.378 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:02:17 compute-0 nova_compute[187212]: 2025-11-25 19:02:17.379 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:02:19 compute-0 podman[208355]: 2025-11-25 19:02:19.172646643 +0000 UTC m=+0.093720671 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Nov 25 19:02:29 compute-0 podman[197585]: time="2025-11-25T19:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:02:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:02:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 19:02:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:02:31.057 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:02:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:02:31.058 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:02:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:02:31.058 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:02:31 compute-0 podman[208376]: 2025-11-25 19:02:31.161928832 +0000 UTC m=+0.082254746 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: ERROR   19:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: ERROR   19:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: ERROR   19:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: ERROR   19:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: ERROR   19:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:02:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:02:38 compute-0 podman[208411]: 2025-11-25 19:02:38.207331281 +0000 UTC m=+0.130463988 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 25 19:02:40 compute-0 podman[208437]: 2025-11-25 19:02:40.150672096 +0000 UTC m=+0.076409522 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Nov 25 19:02:47 compute-0 podman[208456]: 2025-11-25 19:02:47.171821947 +0000 UTC m=+0.087347630 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm)
Nov 25 19:02:50 compute-0 podman[208478]: 2025-11-25 19:02:50.161836288 +0000 UTC m=+0.085037909 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:02:59 compute-0 podman[197585]: time="2025-11-25T19:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:02:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:02:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: ERROR   19:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: ERROR   19:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: ERROR   19:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: ERROR   19:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: ERROR   19:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:03:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:03:02 compute-0 podman[208498]: 2025-11-25 19:03:02.157703642 +0000 UTC m=+0.078725073 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:03:09 compute-0 podman[208524]: 2025-11-25 19:03:09.21666049 +0000 UTC m=+0.128807227 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Nov 25 19:03:11 compute-0 podman[208550]: 2025-11-25 19:03:11.191786669 +0000 UTC m=+0.106882346 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.076 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.077 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.589 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.589 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.590 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.590 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.590 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.591 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.591 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:03:15 compute-0 nova_compute[187212]: 2025-11-25 19:03:15.591 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.111 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.112 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.112 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.113 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.361 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.364 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.389 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.391 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6143MB free_disk=73.03132629394531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.391 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:03:16 compute-0 nova_compute[187212]: 2025-11-25 19:03:16.392 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:03:17 compute-0 nova_compute[187212]: 2025-11-25 19:03:17.460 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:03:17 compute-0 nova_compute[187212]: 2025-11-25 19:03:17.461 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:03:16 up 55 min,  0 user,  load average: 0.25, 0.64, 0.64\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:03:17 compute-0 nova_compute[187212]: 2025-11-25 19:03:17.486 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:03:17 compute-0 nova_compute[187212]: 2025-11-25 19:03:17.994 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:03:18 compute-0 podman[208570]: 2025-11-25 19:03:18.176760692 +0000 UTC m=+0.103937439 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:03:18 compute-0 nova_compute[187212]: 2025-11-25 19:03:18.506 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:03:18 compute-0 nova_compute[187212]: 2025-11-25 19:03:18.506 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:03:21 compute-0 podman[208592]: 2025-11-25 19:03:21.189535795 +0000 UTC m=+0.095236439 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:03:29 compute-0 podman[197585]: time="2025-11-25T19:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:03:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:03:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 25 19:03:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:03:31.059 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:03:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:03:31.060 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:03:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:03:31.060 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: ERROR   19:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: ERROR   19:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: ERROR   19:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: ERROR   19:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: ERROR   19:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:03:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:03:33 compute-0 podman[208614]: 2025-11-25 19:03:33.166811858 +0000 UTC m=+0.080726816 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:03:40 compute-0 podman[208638]: 2025-11-25 19:03:40.214019906 +0000 UTC m=+0.132359591 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Nov 25 19:03:42 compute-0 podman[208665]: 2025-11-25 19:03:42.155715593 +0000 UTC m=+0.076608987 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Nov 25 19:03:49 compute-0 podman[208684]: 2025-11-25 19:03:49.337406328 +0000 UTC m=+0.096145794 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Nov 25 19:03:52 compute-0 podman[208705]: 2025-11-25 19:03:52.154776037 +0000 UTC m=+0.070392524 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:03:59 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:03:59.553 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:03:59 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:03:59.554 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:03:59 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:03:59.558 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:03:59 compute-0 podman[197585]: time="2025-11-25T19:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:03:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:03:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: ERROR   19:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: ERROR   19:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: ERROR   19:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: ERROR   19:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: ERROR   19:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:04:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:04:04 compute-0 podman[208727]: 2025-11-25 19:04:04.162300318 +0000 UTC m=+0.083142981 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:04:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.386 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:60:ca 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-7dd443a0-2954-4507-bcdb-63071a943f4a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7dd443a0-2954-4507-bcdb-63071a943f4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8452218b0aa04a20a3969d637355f8c1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f34b440-e8f3-4862-980c-9401f635efb2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5e263b0d-c812-4443-b275-6064c6f0743c) old=Port_Binding(mac=['fa:16:3e:68:60:ca'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7dd443a0-2954-4507-bcdb-63071a943f4a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7dd443a0-2954-4507-bcdb-63071a943f4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8452218b0aa04a20a3969d637355f8c1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:04:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.388 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5e263b0d-c812-4443-b275-6064c6f0743c in datapath 7dd443a0-2954-4507-bcdb-63071a943f4a updated
Nov 25 19:04:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.390 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7dd443a0-2954-4507-bcdb-63071a943f4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:04:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.391 104356 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpv206g5e3/privsep.sock']
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.056 104356 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.056 104356 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpv206g5e3/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.942 208756 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.948 208756 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.952 208756 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:04.953 208756 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208756
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.058 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9c5f72-6cf6-45b6-b954-f27bf8b5bcdb]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.501 208756 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.501 208756 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.501 208756 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.931 208756 INFO oslo_service.backend [-] Loading backend: eventlet
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.936 208756 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Nov 25 19:04:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:05.970 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[90f9766e-7616-42fa-8a25-34da8878b11b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:04:08 compute-0 nova_compute[187212]: 2025-11-25 19:04:08.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:08 compute-0 nova_compute[187212]: 2025-11-25 19:04:08.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:04:08 compute-0 nova_compute[187212]: 2025-11-25 19:04:08.686 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:04:08 compute-0 nova_compute[187212]: 2025-11-25 19:04:08.688 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:08 compute-0 nova_compute[187212]: 2025-11-25 19:04:08.688 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:04:09 compute-0 nova_compute[187212]: 2025-11-25 19:04:09.197 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:10 compute-0 nova_compute[187212]: 2025-11-25 19:04:10.703 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:10 compute-0 nova_compute[187212]: 2025-11-25 19:04:10.704 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:10 compute-0 nova_compute[187212]: 2025-11-25 19:04:10.704 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:10 compute-0 nova_compute[187212]: 2025-11-25 19:04:10.704 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:04:10 compute-0 nova_compute[187212]: 2025-11-25 19:04:10.705 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:11 compute-0 podman[208761]: 2025-11-25 19:04:11.191707104 +0000 UTC m=+0.113442623 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.225 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.226 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.226 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.226 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.439 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.440 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.467 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.468 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6035MB free_disk=73.03152084350586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.468 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:04:11 compute-0 nova_compute[187212]: 2025-11-25 19:04:11.469 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:04:12 compute-0 nova_compute[187212]: 2025-11-25 19:04:12.524 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:04:12 compute-0 nova_compute[187212]: 2025-11-25 19:04:12.524 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:04:11 up 56 min,  0 user,  load average: 0.10, 0.53, 0.60\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:04:12 compute-0 nova_compute[187212]: 2025-11-25 19:04:12.551 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:04:13 compute-0 nova_compute[187212]: 2025-11-25 19:04:13.067 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:04:13 compute-0 podman[208788]: 2025-11-25 19:04:13.167070575 +0000 UTC m=+0.087035234 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS 
Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 19:04:13 compute-0 nova_compute[187212]: 2025-11-25 19:04:13.579 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:04:13 compute-0 nova_compute[187212]: 2025-11-25 19:04:13.579 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:04:14 compute-0 nova_compute[187212]: 2025-11-25 19:04:14.049 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:14 compute-0 nova_compute[187212]: 2025-11-25 19:04:14.049 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:14 compute-0 nova_compute[187212]: 2025-11-25 19:04:14.050 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:14 compute-0 nova_compute[187212]: 2025-11-25 19:04:14.050 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:04:20 compute-0 podman[208807]: 2025-11-25 19:04:20.166035235 +0000 UTC m=+0.090085764 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 19:04:23 compute-0 podman[208829]: 2025-11-25 19:04:23.17377647 +0000 UTC m=+0.092710704 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 19:04:29 compute-0 podman[197585]: time="2025-11-25T19:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:04:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:04:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2574 "" "Go-http-client/1.1"
Nov 25 19:04:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:31.062 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:04:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:31.063 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:04:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:04:31.063 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: ERROR   19:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: ERROR   19:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: ERROR   19:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: ERROR   19:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: ERROR   19:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:04:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:04:35 compute-0 podman[208852]: 2025-11-25 19:04:35.147687102 +0000 UTC m=+0.073294320 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:04:42 compute-0 podman[208876]: 2025-11-25 19:04:42.241918983 +0000 UTC m=+0.156046580 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:04:44 compute-0 podman[208902]: 2025-11-25 19:04:44.155509251 +0000 UTC m=+0.076453855 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:04:51 compute-0 podman[208922]: 2025-11-25 19:04:51.1536789 +0000 UTC m=+0.073654750 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 25 19:04:54 compute-0 podman[208944]: 2025-11-25 19:04:54.165850503 +0000 UTC m=+0.086503046 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251125)
Nov 25 19:04:59 compute-0 podman[197585]: time="2025-11-25T19:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:04:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:04:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: ERROR   19:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: ERROR   19:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: ERROR   19:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: ERROR   19:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: ERROR   19:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:05:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:05:06 compute-0 podman[208964]: 2025-11-25 19:05:06.158233107 +0000 UTC m=+0.082132158 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:05:09 compute-0 nova_compute[187212]: 2025-11-25 19:05:09.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.175 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.923 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.925 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.961 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.963 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6051MB free_disk=73.03153991699219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.963 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:05:10 compute-0 nova_compute[187212]: 2025-11-25 19:05:10.964 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.097 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.097 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:05:10 up 57 min,  0 user,  load average: 0.10, 0.45, 0.56\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.154 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.217 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.217 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.243 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.265 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.296 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:05:12 compute-0 nova_compute[187212]: 2025-11-25 19:05:12.805 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:05:13 compute-0 podman[208990]: 2025-11-25 19:05:13.222817468 +0000 UTC m=+0.136663038 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:05:13 compute-0 nova_compute[187212]: 2025-11-25 19:05:13.313 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:05:13 compute-0 nova_compute[187212]: 2025-11-25 19:05:13.314 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.350s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:05:15 compute-0 podman[209016]: 2025-11-25 19:05:15.184190407 +0000 UTC m=+0.095734352 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 25 19:05:15 compute-0 nova_compute[187212]: 2025-11-25 19:05:15.309 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:15 compute-0 nova_compute[187212]: 2025-11-25 19:05:15.310 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:15 compute-0 nova_compute[187212]: 2025-11-25 19:05:15.819 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:15 compute-0 nova_compute[187212]: 2025-11-25 19:05:15.819 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:15 compute-0 nova_compute[187212]: 2025-11-25 19:05:15.819 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:05:22 compute-0 podman[209037]: 2025-11-25 19:05:22.217642413 +0000 UTC m=+0.133719969 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Nov 25 19:05:25 compute-0 podman[209058]: 2025-11-25 19:05:25.153679609 +0000 UTC m=+0.074444942 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:05:29 compute-0 podman[197585]: time="2025-11-25T19:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:05:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:05:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 25 19:05:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:05:31.064 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:05:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:05:31.064 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:05:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:05:31.065 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: ERROR   19:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: ERROR   19:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: ERROR   19:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: ERROR   19:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: ERROR   19:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:05:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:05:37 compute-0 podman[209076]: 2025-11-25 19:05:37.166264396 +0000 UTC m=+0.084274505 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:05:44 compute-0 podman[209100]: 2025-11-25 19:05:44.225638437 +0000 UTC m=+0.151512305 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:05:46 compute-0 podman[209124]: 2025-11-25 19:05:46.168699455 +0000 UTC m=+0.082742535 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 19:05:53 compute-0 podman[209143]: 2025-11-25 19:05:53.169769405 +0000 UTC m=+0.097487809 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter)
Nov 25 19:05:56 compute-0 podman[209165]: 2025-11-25 19:05:56.164954224 +0000 UTC m=+0.089665870 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:05:59 compute-0 podman[197585]: time="2025-11-25T19:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:05:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:05:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: ERROR   19:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: ERROR   19:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: ERROR   19:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: ERROR   19:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: ERROR   19:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:06:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:06:08 compute-0 podman[209186]: 2025-11-25 19:06:08.16096515 +0000 UTC m=+0.079631262 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:06:09 compute-0 nova_compute[187212]: 2025-11-25 19:06:09.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:10 compute-0 nova_compute[187212]: 2025-11-25 19:06:10.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:10 compute-0 nova_compute[187212]: 2025-11-25 19:06:10.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:06:11 compute-0 nova_compute[187212]: 2025-11-25 19:06:11.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.168 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.689 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.900 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.901 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.936 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.936 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6069MB free_disk=73.03152084350586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.937 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:06:12 compute-0 nova_compute[187212]: 2025-11-25 19:06:12.937 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:06:13 compute-0 nova_compute[187212]: 2025-11-25 19:06:13.988 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:06:13 compute-0 nova_compute[187212]: 2025-11-25 19:06:13.989 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:06:12 up 58 min,  0 user,  load average: 0.04, 0.37, 0.53\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:06:14 compute-0 nova_compute[187212]: 2025-11-25 19:06:14.087 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:06:14 compute-0 nova_compute[187212]: 2025-11-25 19:06:14.595 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:06:15 compute-0 nova_compute[187212]: 2025-11-25 19:06:15.105 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:06:15 compute-0 nova_compute[187212]: 2025-11-25 19:06:15.105 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:06:15 compute-0 podman[209212]: 2025-11-25 19:06:15.20765981 +0000 UTC m=+0.127224795 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Nov 25 19:06:16 compute-0 nova_compute[187212]: 2025-11-25 19:06:16.106 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:16 compute-0 nova_compute[187212]: 2025-11-25 19:06:16.107 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:16 compute-0 nova_compute[187212]: 2025-11-25 19:06:16.107 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:06:17 compute-0 podman[209238]: 2025-11-25 19:06:17.165043522 +0000 UTC m=+0.087938864 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 25 19:06:24 compute-0 podman[209257]: 2025-11-25 19:06:24.172381959 +0000 UTC m=+0.094506489 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Nov 25 19:06:27 compute-0 podman[209279]: 2025-11-25 19:06:27.149402242 +0000 UTC m=+0.075264615 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible)
Nov 25 19:06:29 compute-0 podman[197585]: time="2025-11-25T19:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:06:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:06:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 19:06:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:06:31.066 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:06:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:06:31.066 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:06:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:06:31.066 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: ERROR   19:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: ERROR   19:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: ERROR   19:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: ERROR   19:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: ERROR   19:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:06:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:06:39 compute-0 podman[209300]: 2025-11-25 19:06:39.157717906 +0000 UTC m=+0.078947254 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:06:46 compute-0 podman[209325]: 2025-11-25 19:06:46.261189585 +0000 UTC m=+0.175197109 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 19:06:48 compute-0 podman[209352]: 2025-11-25 19:06:48.130781388 +0000 UTC m=+0.059197265 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Nov 25 19:06:55 compute-0 podman[209372]: 2025-11-25 19:06:55.175152766 +0000 UTC m=+0.089821384 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64)
Nov 25 19:06:58 compute-0 podman[209393]: 2025-11-25 19:06:58.177332852 +0000 UTC m=+0.095404704 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 25 19:06:59 compute-0 podman[197585]: time="2025-11-25T19:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:06:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:06:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: ERROR   19:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: ERROR   19:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: ERROR   19:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: ERROR   19:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: ERROR   19:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:07:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:07:09 compute-0 sshd-session[207474]: Received disconnect from 38.102.83.130 port 59074:11: disconnected by user
Nov 25 19:07:09 compute-0 sshd-session[207474]: Disconnected from user zuul 38.102.83.130 port 59074
Nov 25 19:07:09 compute-0 sshd-session[207471]: pam_unix(sshd:session): session closed for user zuul
Nov 25 19:07:09 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Nov 25 19:07:09 compute-0 systemd[1]: session-27.scope: Consumed 7.705s CPU time.
Nov 25 19:07:09 compute-0 systemd-logind[820]: Session 27 logged out. Waiting for processes to exit.
Nov 25 19:07:09 compute-0 systemd-logind[820]: Removed session 27.
Nov 25 19:07:09 compute-0 podman[209413]: 2025-11-25 19:07:09.578250147 +0000 UTC m=+0.090644539 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:07:10 compute-0 nova_compute[187212]: 2025-11-25 19:07:10.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:10 compute-0 nova_compute[187212]: 2025-11-25 19:07:10.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:10 compute-0 nova_compute[187212]: 2025-11-25 19:07:10.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:07:11 compute-0 nova_compute[187212]: 2025-11-25 19:07:11.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.693 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.919 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.922 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.946 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.947 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6078MB free_disk=73.03152084350586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.948 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:07:12 compute-0 nova_compute[187212]: 2025-11-25 19:07:12.948 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:07:14 compute-0 nova_compute[187212]: 2025-11-25 19:07:14.000 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:07:14 compute-0 nova_compute[187212]: 2025-11-25 19:07:14.001 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:07:12 up 59 min,  0 user,  load average: 0.01, 0.30, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:07:14 compute-0 nova_compute[187212]: 2025-11-25 19:07:14.062 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:07:14 compute-0 nova_compute[187212]: 2025-11-25 19:07:14.571 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:07:15 compute-0 nova_compute[187212]: 2025-11-25 19:07:15.081 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:07:15 compute-0 nova_compute[187212]: 2025-11-25 19:07:15.081 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.133s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:07:16 compute-0 nova_compute[187212]: 2025-11-25 19:07:16.077 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:16 compute-0 nova_compute[187212]: 2025-11-25 19:07:16.078 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:16 compute-0 nova_compute[187212]: 2025-11-25 19:07:16.078 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:16 compute-0 nova_compute[187212]: 2025-11-25 19:07:16.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:17 compute-0 podman[209439]: 2025-11-25 19:07:17.207830681 +0000 UTC m=+0.131157226 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:07:18 compute-0 nova_compute[187212]: 2025-11-25 19:07:18.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:07:19 compute-0 podman[209465]: 2025-11-25 19:07:19.15684036 +0000 UTC m=+0.075141361 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 25 19:07:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:24.321 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:07:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:24.322 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:07:26 compute-0 podman[209485]: 2025-11-25 19:07:26.171019315 +0000 UTC m=+0.091886781 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6)
Nov 25 19:07:26 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:26.570 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:00:3e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c22848eb-1bc4-48a1-9d66-7ee29e27636b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c22848eb-1bc4-48a1-9d66-7ee29e27636b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '531fca9280df4ee286081a8ba6abe7ba', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbbfb93a-fc39-4419-b430-c786c7a6b93d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=55e68d2e-c54c-43c4-8122-0dad9160accb) old=Port_Binding(mac=['fa:16:3e:4f:00:3e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c22848eb-1bc4-48a1-9d66-7ee29e27636b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c22848eb-1bc4-48a1-9d66-7ee29e27636b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '531fca9280df4ee286081a8ba6abe7ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:07:26 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:26.572 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 55e68d2e-c54c-43c4-8122-0dad9160accb in datapath c22848eb-1bc4-48a1-9d66-7ee29e27636b updated
Nov 25 19:07:26 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:26.573 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c22848eb-1bc4-48a1-9d66-7ee29e27636b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:07:26 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:26.574 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c59075-a150-44b7-b312-1a7d9e388b21]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:07:29 compute-0 podman[209506]: 2025-11-25 19:07:29.216645114 +0000 UTC m=+0.129493652 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:07:29 compute-0 podman[197585]: time="2025-11-25T19:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:07:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:07:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 25 19:07:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:30.324 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:07:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:31.068 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:07:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:31.068 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:07:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:31.068 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: ERROR   19:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: ERROR   19:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: ERROR   19:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: ERROR   19:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: ERROR   19:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:07:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:07:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:38.065 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:b5:65 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d9f2a465-a77f-4917-ab87-e5262f14d323', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f2a465-a77f-4917-ab87-e5262f14d323', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e202c186dce249b0af6e6d8003f9d2fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbac9e19-6394-427f-a697-8049d80c030a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b2d7ce1b-3cf6-4065-8a42-0548afdf23fa) old=Port_Binding(mac=['fa:16:3e:8c:b5:65'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d9f2a465-a77f-4917-ab87-e5262f14d323', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f2a465-a77f-4917-ab87-e5262f14d323', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e202c186dce249b0af6e6d8003f9d2fc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:07:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:38.066 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b2d7ce1b-3cf6-4065-8a42-0548afdf23fa in datapath d9f2a465-a77f-4917-ab87-e5262f14d323 updated
Nov 25 19:07:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:38.069 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f2a465-a77f-4917-ab87-e5262f14d323, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:07:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:07:38.070 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[974463e8-f04e-488f-ab1e-db629ca37eb4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:07:40 compute-0 podman[209527]: 2025-11-25 19:07:40.165688182 +0000 UTC m=+0.085674608 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:07:48 compute-0 podman[209552]: 2025-11-25 19:07:48.215318002 +0000 UTC m=+0.139633960 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:07:50 compute-0 podman[209578]: 2025-11-25 19:07:50.182071822 +0000 UTC m=+0.103205860 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Nov 25 19:07:57 compute-0 podman[209597]: 2025-11-25 19:07:57.154021721 +0000 UTC m=+0.082894185 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, config_id=edpm)
Nov 25 19:07:59 compute-0 podman[197585]: time="2025-11-25T19:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:07:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:07:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Nov 25 19:08:00 compute-0 podman[209619]: 2025-11-25 19:08:00.175186007 +0000 UTC m=+0.095454996 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: ERROR   19:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: ERROR   19:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: ERROR   19:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: ERROR   19:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: ERROR   19:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:08:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:08:10 compute-0 nova_compute[187212]: 2025-11-25 19:08:10.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:10 compute-0 nova_compute[187212]: 2025-11-25 19:08:10.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:08:11 compute-0 podman[209640]: 2025-11-25 19:08:11.162743429 +0000 UTC m=+0.085780561 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:08:11 compute-0 nova_compute[187212]: 2025-11-25 19:08:11.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:11 compute-0 nova_compute[187212]: 2025-11-25 19:08:11.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.710 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.710 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.711 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.711 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.891 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.893 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.916 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.917 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6067MB free_disk=73.03153991699219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.918 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:08:12 compute-0 nova_compute[187212]: 2025-11-25 19:08:12.918 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:08:14 compute-0 nova_compute[187212]: 2025-11-25 19:08:14.002 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:08:14 compute-0 nova_compute[187212]: 2025-11-25 19:08:14.003 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:08:12 up  1:00,  0 user,  load average: 0.12, 0.27, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:08:14 compute-0 nova_compute[187212]: 2025-11-25 19:08:14.027 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:08:14 compute-0 nova_compute[187212]: 2025-11-25 19:08:14.540 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:08:15 compute-0 nova_compute[187212]: 2025-11-25 19:08:15.058 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:08:15 compute-0 nova_compute[187212]: 2025-11-25 19:08:15.058 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:08:17 compute-0 nova_compute[187212]: 2025-11-25 19:08:17.055 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:17 compute-0 nova_compute[187212]: 2025-11-25 19:08:17.055 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:17 compute-0 nova_compute[187212]: 2025-11-25 19:08:17.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:17 compute-0 nova_compute[187212]: 2025-11-25 19:08:17.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:08:19 compute-0 podman[209667]: 2025-11-25 19:08:19.229988571 +0000 UTC m=+0.154829789 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 19:08:21 compute-0 podman[209694]: 2025-11-25 19:08:21.140356076 +0000 UTC m=+0.068507976 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:08:26 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:08:26.411 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:08:26 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:08:26.412 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:08:28 compute-0 podman[209716]: 2025-11-25 19:08:28.1728611 +0000 UTC m=+0.085168884 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Nov 25 19:08:29 compute-0 podman[197585]: time="2025-11-25T19:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:08:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:08:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2575 "" "Go-http-client/1.1"
Nov 25 19:08:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:08:31.069 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:08:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:08:31.069 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:08:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:08:31.069 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:08:31 compute-0 podman[209738]: 2025-11-25 19:08:31.193988465 +0000 UTC m=+0.112618097 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: ERROR   19:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: ERROR   19:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: ERROR   19:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: ERROR   19:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: ERROR   19:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:08:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:08:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:08:36.413 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:08:42 compute-0 podman[209760]: 2025-11-25 19:08:42.127117613 +0000 UTC m=+0.054767124 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:08:50 compute-0 podman[209784]: 2025-11-25 19:08:50.213553572 +0000 UTC m=+0.127047628 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 25 19:08:52 compute-0 podman[209810]: 2025-11-25 19:08:52.141121139 +0000 UTC m=+0.066648037 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 19:08:59 compute-0 podman[209829]: 2025-11-25 19:08:59.174947579 +0000 UTC m=+0.091009129 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 25 19:08:59 compute-0 podman[197585]: time="2025-11-25T19:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:08:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:08:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2577 "" "Go-http-client/1.1"
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: ERROR   19:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: ERROR   19:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: ERROR   19:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: ERROR   19:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: ERROR   19:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:09:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:09:02 compute-0 podman[209850]: 2025-11-25 19:09:02.165358295 +0000 UTC m=+0.088772210 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible)
Nov 25 19:09:10 compute-0 nova_compute[187212]: 2025-11-25 19:09:10.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:10 compute-0 nova_compute[187212]: 2025-11-25 19:09:10.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:09:13 compute-0 podman[209871]: 2025-11-25 19:09:13.157695153 +0000 UTC m=+0.081555260 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:09:13 compute-0 nova_compute[187212]: 2025-11-25 19:09:13.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:13 compute-0 nova_compute[187212]: 2025-11-25 19:09:13.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:14 compute-0 nova_compute[187212]: 2025-11-25 19:09:14.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:14 compute-0 nova_compute[187212]: 2025-11-25 19:09:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:14 compute-0 nova_compute[187212]: 2025-11-25 19:09:14.844 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:09:14 compute-0 nova_compute[187212]: 2025-11-25 19:09:14.845 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:09:14 compute-0 nova_compute[187212]: 2025-11-25 19:09:14.845 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:09:14 compute-0 nova_compute[187212]: 2025-11-25 19:09:14.846 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:09:15 compute-0 nova_compute[187212]: 2025-11-25 19:09:15.061 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:09:15 compute-0 nova_compute[187212]: 2025-11-25 19:09:15.063 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:09:15 compute-0 nova_compute[187212]: 2025-11-25 19:09:15.097 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:09:15 compute-0 nova_compute[187212]: 2025-11-25 19:09:15.099 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6072MB free_disk=73.03152084350586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:09:15 compute-0 nova_compute[187212]: 2025-11-25 19:09:15.099 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:09:15 compute-0 nova_compute[187212]: 2025-11-25 19:09:15.100 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:09:16 compute-0 nova_compute[187212]: 2025-11-25 19:09:16.156 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:09:16 compute-0 nova_compute[187212]: 2025-11-25 19:09:16.156 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:09:15 up  1:01,  0 user,  load average: 0.19, 0.26, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:09:16 compute-0 nova_compute[187212]: 2025-11-25 19:09:16.186 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:09:16 compute-0 nova_compute[187212]: 2025-11-25 19:09:16.694 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.203 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.203 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.204 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.204 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.710 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.711 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:17 compute-0 nova_compute[187212]: 2025-11-25 19:09:17.711 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:09:18 compute-0 nova_compute[187212]: 2025-11-25 19:09:18.220 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:20 compute-0 nova_compute[187212]: 2025-11-25 19:09:20.726 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:20 compute-0 nova_compute[187212]: 2025-11-25 19:09:20.727 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:20 compute-0 nova_compute[187212]: 2025-11-25 19:09:20.728 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:21 compute-0 podman[209896]: 2025-11-25 19:09:21.226282493 +0000 UTC m=+0.146615260 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:09:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:22.208 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:82:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77f77932d952469abb3b5c9ccc266fb0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be90cb2d-d452-4c14-ae54-64fed8821827, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=96ab31b0-301b-470c-878e-93842be7e83f) old=Port_Binding(mac=['fa:16:3e:f1:82:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77f77932d952469abb3b5c9ccc266fb0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:09:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:22.209 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 96ab31b0-301b-470c-878e-93842be7e83f in datapath f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a updated
Nov 25 19:09:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:22.211 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:09:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:22.212 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[100e6284-be2f-427b-9274-554070214d08]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:09:23 compute-0 podman[209924]: 2025-11-25 19:09:23.165785559 +0000 UTC m=+0.087341118 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 25 19:09:23 compute-0 nova_compute[187212]: 2025-11-25 19:09:23.171 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:27 compute-0 nova_compute[187212]: 2025-11-25 19:09:27.903 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:09:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:28.090 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:09:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:28.092 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:09:29 compute-0 podman[197585]: time="2025-11-25T19:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:09:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:09:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2578 "" "Go-http-client/1.1"
Nov 25 19:09:30 compute-0 podman[209944]: 2025-11-25 19:09:30.152277791 +0000 UTC m=+0.077310182 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:09:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:31.072 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:09:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:31.072 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:09:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:31.073 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: ERROR   19:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: ERROR   19:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: ERROR   19:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: ERROR   19:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: ERROR   19:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:09:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:09:33 compute-0 podman[209967]: 2025-11-25 19:09:33.182601385 +0000 UTC m=+0.107525724 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 19:09:33 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:33.378 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:b2:38 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bf064688-1aaf-4359-b5d5-28fa76ffc3ed', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf064688-1aaf-4359-b5d5-28fa76ffc3ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2bd838afa47e4be7937ff6c483757210', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=630d93bc-ccca-4e6c-8483-a2417e063ffb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2457a7fe-efab-4e46-ac31-600e97be4ce8) old=Port_Binding(mac=['fa:16:3e:b3:b2:38'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bf064688-1aaf-4359-b5d5-28fa76ffc3ed', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bf064688-1aaf-4359-b5d5-28fa76ffc3ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2bd838afa47e4be7937ff6c483757210', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:09:33 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:33.379 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2457a7fe-efab-4e46-ac31-600e97be4ce8 in datapath bf064688-1aaf-4359-b5d5-28fa76ffc3ed updated
Nov 25 19:09:33 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:33.380 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bf064688-1aaf-4359-b5d5-28fa76ffc3ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:09:33 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:33.381 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2748fe43-b0af-44ff-8a68-fb3ef03dccae]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:09:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:09:34.093 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:09:44 compute-0 podman[209987]: 2025-11-25 19:09:44.165350526 +0000 UTC m=+0.081692228 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:09:52 compute-0 podman[210011]: 2025-11-25 19:09:52.244937816 +0000 UTC m=+0.168081060 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Nov 25 19:09:54 compute-0 podman[210038]: 2025-11-25 19:09:54.148695453 +0000 UTC m=+0.073360847 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251125, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:09:59 compute-0 podman[197585]: time="2025-11-25T19:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:09:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:09:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2575 "" "Go-http-client/1.1"
Nov 25 19:10:01 compute-0 podman[210059]: 2025-11-25 19:10:01.170940483 +0000 UTC m=+0.088361206 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: ERROR   19:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: ERROR   19:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: ERROR   19:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: ERROR   19:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: ERROR   19:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:10:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:10:04 compute-0 podman[210082]: 2025-11-25 19:10:04.171175038 +0000 UTC m=+0.091409276 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest)
Nov 25 19:10:05 compute-0 nova_compute[187212]: 2025-11-25 19:10:05.089 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:05 compute-0 nova_compute[187212]: 2025-11-25 19:10:05.089 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:05 compute-0 nova_compute[187212]: 2025-11-25 19:10:05.595 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:10:06 compute-0 nova_compute[187212]: 2025-11-25 19:10:06.211 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:06 compute-0 nova_compute[187212]: 2025-11-25 19:10:06.212 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:06 compute-0 nova_compute[187212]: 2025-11-25 19:10:06.219 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:10:06 compute-0 nova_compute[187212]: 2025-11-25 19:10:06.219 187216 INFO nova.compute.claims [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:10:07 compute-0 nova_compute[187212]: 2025-11-25 19:10:07.294 187216 DEBUG nova.compute.provider_tree [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:10:07 compute-0 nova_compute[187212]: 2025-11-25 19:10:07.802 187216 DEBUG nova.scheduler.client.report [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:10:08 compute-0 nova_compute[187212]: 2025-11-25 19:10:08.313 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:08 compute-0 nova_compute[187212]: 2025-11-25 19:10:08.314 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:10:08 compute-0 nova_compute[187212]: 2025-11-25 19:10:08.830 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:10:08 compute-0 nova_compute[187212]: 2025-11-25 19:10:08.831 187216 DEBUG nova.network.neutron [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:10:08 compute-0 nova_compute[187212]: 2025-11-25 19:10:08.832 187216 WARNING neutronclient.v2_0.client [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:08 compute-0 nova_compute[187212]: 2025-11-25 19:10:08.834 187216 WARNING neutronclient.v2_0.client [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:09 compute-0 nova_compute[187212]: 2025-11-25 19:10:09.343 187216 INFO nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:10:09 compute-0 nova_compute[187212]: 2025-11-25 19:10:09.853 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.496 187216 DEBUG nova.network.neutron [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Successfully created port: 7ea961f2-f0b6-4863-b79e-68c0609bfc1c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.880 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.883 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.884 187216 INFO nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Creating image(s)
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.886 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "/var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.886 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "/var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.887 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "/var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.888 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:10 compute-0 nova_compute[187212]: 2025-11-25 19:10:10.889 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.172 187216 DEBUG nova.network.neutron [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Successfully updated port: 7ea961f2-f0b6-4863-b79e-68c0609bfc1c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.258 187216 DEBUG nova.compute.manager [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-changed-7ea961f2-f0b6-4863-b79e-68c0609bfc1c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.258 187216 DEBUG nova.compute.manager [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Refreshing instance network info cache due to event network-changed-7ea961f2-f0b6-4863-b79e-68c0609bfc1c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.258 187216 DEBUG oslo_concurrency.lockutils [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-d2425cfd-4407-4235-bdab-f2ede4bc1f20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.259 187216 DEBUG oslo_concurrency.lockutils [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-d2425cfd-4407-4235-bdab-f2ede4bc1f20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.259 187216 DEBUG nova.network.neutron [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Refreshing network info cache for port 7ea961f2-f0b6-4863-b79e-68c0609bfc1c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.504 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.511 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.512 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.596 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.part --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.598 187216 DEBUG nova.virt.images [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] 5ca774a8-6150-424f-aaca-03ab3a3ee8cf was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.599 187216 DEBUG nova.privsep.utils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.600 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.part /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.681 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "refresh_cache-d2425cfd-4407-4235-bdab-f2ede4bc1f20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.806 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.part /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.converted" returned: 0 in 0.205s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.814 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.848 187216 WARNING neutronclient.v2_0.client [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.891 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730.converted --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.892 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.893 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.899 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:10:12 compute-0 nova_compute[187212]: 2025-11-25 19:10:12.901 187216 INFO oslo.privsep.daemon [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpe4ncufnx/privsep.sock']
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.687 187216 INFO oslo.privsep.daemon [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Spawned new privsep daemon via rootwrap
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.495 210123 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.502 210123 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.505 210123 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.506 210123 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210123
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.776 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.822 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.823 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.823 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.823 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.826 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.827 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.870 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.871 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.903 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.905 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.905 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.931 187216 DEBUG nova.network.neutron [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.992 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.993 187216 DEBUG nova.virt.disk.api [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Checking if we can resize image /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:10:13 compute-0 nova_compute[187212]: 2025-11-25 19:10:13.994 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.044 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.045 187216 DEBUG nova.virt.disk.api [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Cannot resize image /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.046 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.047 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Ensure instance console log exists: /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.047 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.048 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.048 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.309 187216 DEBUG nova.network.neutron [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.819 187216 DEBUG oslo_concurrency.lockutils [req-710cc3a8-609d-4c48-b55c-8790f3fac8b7 req-6df40cb5-385a-45f0-b18b-6b818bbb6a7d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-d2425cfd-4407-4235-bdab-f2ede4bc1f20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.820 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquired lock "refresh_cache-d2425cfd-4407-4235-bdab-f2ede4bc1f20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:10:14 compute-0 nova_compute[187212]: 2025-11-25 19:10:14.821 187216 DEBUG nova.network.neutron [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:10:15 compute-0 podman[210141]: 2025-11-25 19:10:15.165861827 +0000 UTC m=+0.085515220 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.698 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.698 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.699 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.699 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.897 187216 DEBUG nova.network.neutron [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.907 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.909 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.938 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.939 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5983MB free_disk=72.99691390991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.940 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:15 compute-0 nova_compute[187212]: 2025-11-25 19:10:15.940 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:16 compute-0 nova_compute[187212]: 2025-11-25 19:10:16.339 187216 WARNING neutronclient.v2_0.client [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.009 187216 DEBUG nova.network.neutron [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Updating instance_info_cache with network_info: [{"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.093 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance d2425cfd-4407-4235-bdab-f2ede4bc1f20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.094 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.094 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:10:15 up  1:02,  0 user,  load average: 0.27, 0.27, 0.44\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_2bd838afa47e4be7937ff6c483757210': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.164 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.210 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.211 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.231 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.259 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.300 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.522 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Releasing lock "refresh_cache-d2425cfd-4407-4235-bdab-f2ede4bc1f20" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.523 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Instance network_info: |[{"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.527 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Start _get_guest_xml network_info=[{"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.532 187216 WARNING nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.534 187216 DEBUG nova.virt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-1521363589', uuid='d2425cfd-4407-4235-bdab-f2ede4bc1f20'), owner=OwnerMeta(userid='7cbf59052404450fbcefb08d20815105', username='tempest-TestDataModel-229409854-project-admin', projectid='2bd838afa47e4be7937ff6c483757210', projectname='tempest-TestDataModel-229409854'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764097817.5344894) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.539 187216 DEBUG nova.virt.libvirt.host [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.540 187216 DEBUG nova.virt.libvirt.host [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.544 187216 DEBUG nova.virt.libvirt.host [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.544 187216 DEBUG nova.virt.libvirt.host [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.547 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.547 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.548 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.548 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.549 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.549 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.549 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.550 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.550 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.551 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.551 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.551 187216 DEBUG nova.virt.hardware [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.557 187216 DEBUG nova.privsep.utils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.559 187216 DEBUG nova.virt.libvirt.vif [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:10:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1521363589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1521363589',id=3,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2bd838afa47e4be7937ff6c483757210',ramdisk_id='',reservation_id='r-oa000mtz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-229409854',owner_user_name='tempest-TestDataModel-229409854-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:10:09Z,user_data=None,user_id='7cbf59052404450fbcefb08d20815105',uuid=d2425cfd-4407-4235-bdab-f2ede4bc1f20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.560 187216 DEBUG nova.network.os_vif_util [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Converting VIF {"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.561 187216 DEBUG nova.network.os_vif_util [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.563 187216 DEBUG nova.objects.instance [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lazy-loading 'pci_devices' on Instance uuid d2425cfd-4407-4235-bdab-f2ede4bc1f20 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.849 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updated inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.850 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Nov 25 19:10:17 compute-0 nova_compute[187212]: 2025-11-25 19:10:17.850 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.072 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <uuid>d2425cfd-4407-4235-bdab-f2ede4bc1f20</uuid>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <name>instance-00000003</name>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:name>tempest-TestDataModel-server-1521363589</nova:name>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:10:17</nova:creationTime>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:10:18 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:10:18 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:user uuid="7cbf59052404450fbcefb08d20815105">tempest-TestDataModel-229409854-project-admin</nova:user>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:project uuid="2bd838afa47e4be7937ff6c483757210">tempest-TestDataModel-229409854</nova:project>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         <nova:port uuid="7ea961f2-f0b6-4863-b79e-68c0609bfc1c">
Nov 25 19:10:18 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <system>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <entry name="serial">d2425cfd-4407-4235-bdab-f2ede4bc1f20</entry>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <entry name="uuid">d2425cfd-4407-4235-bdab-f2ede4bc1f20</entry>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </system>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <os>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </os>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <features>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </features>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.config"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:88:4e:ad"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <target dev="tap7ea961f2-f0"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/console.log" append="off"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <video>
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </video>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:10:18 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:10:18 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:10:18 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:10:18 compute-0 nova_compute[187212]: </domain>
Nov 25 19:10:18 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.075 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Preparing to wait for external event network-vif-plugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.075 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.075 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.076 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.077 187216 DEBUG nova.virt.libvirt.vif [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:10:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1521363589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1521363589',id=3,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2bd838afa47e4be7937ff6c483757210',ramdisk_id='',reservation_id='r-oa000mtz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-229409854',owner_user_name='tempest-TestDataModel-229409854-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:10:09Z,user_data=None,user_id='7cbf59052404450fbcefb08d20815105',uuid=d2425cfd-4407-4235-bdab-f2ede4bc1f20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.077 187216 DEBUG nova.network.os_vif_util [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Converting VIF {"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.078 187216 DEBUG nova.network.os_vif_util [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.079 187216 DEBUG os_vif [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.135 187216 DEBUG ovsdbapp.backend.ovs_idl [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.135 187216 DEBUG ovsdbapp.backend.ovs_idl [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.136 187216 DEBUG ovsdbapp.backend.ovs_idl [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.136 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.136 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.137 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.137 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.139 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.144 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.157 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.157 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.158 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.159 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.160 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '78070116-7a18-594c-9b55-4b1f044e91e2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.203 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.205 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.207 187216 INFO oslo.privsep.daemon [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmphttjht1o/privsep.sock']
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.361 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.362 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.422s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.975 187216 INFO oslo.privsep.daemon [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Spawned new privsep daemon via rootwrap
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.819 210170 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.826 210170 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.829 210170 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 25 19:10:18 compute-0 nova_compute[187212]: 2025-11-25 19:10:18.830 210170 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210170
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.263 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.264 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ea961f2-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.265 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7ea961f2-f0, col_values=(('qos', UUID('ac259a5b-e543-45c8-88ed-dd18e0af368c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.266 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7ea961f2-f0, col_values=(('external_ids', {'iface-id': '7ea961f2-f0b6-4863-b79e-68c0609bfc1c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:4e:ad', 'vm-uuid': 'd2425cfd-4407-4235-bdab-f2ede4bc1f20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.309 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:19 compute-0 NetworkManager[55552]: <info>  [1764097819.3116] manager: (tap7ea961f2-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.312 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.318 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:19 compute-0 nova_compute[187212]: 2025-11-25 19:10:19.319 187216 INFO os_vif [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0')
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.361 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.362 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.362 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.866 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.866 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.867 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] No VIF found with MAC fa:16:3e:88:4e:ad, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:10:20 compute-0 nova_compute[187212]: 2025-11-25 19:10:20.868 187216 INFO nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Using config drive
Nov 25 19:10:21 compute-0 nova_compute[187212]: 2025-11-25 19:10:21.085 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:21 compute-0 nova_compute[187212]: 2025-11-25 19:10:21.384 187216 WARNING neutronclient.v2_0.client [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:21 compute-0 nova_compute[187212]: 2025-11-25 19:10:21.889 187216 INFO nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Creating config drive at /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.config
Nov 25 19:10:21 compute-0 nova_compute[187212]: 2025-11-25 19:10:21.898 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjpp3mnv7 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.038 187216 DEBUG oslo_concurrency.processutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjpp3mnv7" returned: 0 in 0.140s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:10:22 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 25 19:10:22 compute-0 kernel: tap7ea961f2-f0: entered promiscuous mode
Nov 25 19:10:22 compute-0 NetworkManager[55552]: <info>  [1764097822.1858] manager: (tap7ea961f2-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Nov 25 19:10:22 compute-0 ovn_controller[95465]: 2025-11-25T19:10:22Z|00040|binding|INFO|Claiming lport 7ea961f2-f0b6-4863-b79e-68c0609bfc1c for this chassis.
Nov 25 19:10:22 compute-0 ovn_controller[95465]: 2025-11-25T19:10:22Z|00041|binding|INFO|7ea961f2-f0b6-4863-b79e-68c0609bfc1c: Claiming fa:16:3e:88:4e:ad 10.100.0.11
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.187 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.203 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:4e:ad 10.100.0.11'], port_security=['fa:16:3e:88:4e:ad 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd2425cfd-4407-4235-bdab-f2ede4bc1f20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2bd838afa47e4be7937ff6c483757210', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a0d3ee92-eedf-42b9-8d87-eca3db518e98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be90cb2d-d452-4c14-ae54-64fed8821827, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=7ea961f2-f0b6-4863-b79e-68c0609bfc1c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.204 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 7ea961f2-f0b6-4863-b79e-68c0609bfc1c in datapath f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a bound to our chassis
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.206 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a
Nov 25 19:10:22 compute-0 systemd-udevd[210199]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.228 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f5143f59-d608-4682-bcf1-27dba23da046]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.230 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf109b9f3-41 in ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.236 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf109b9f3-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.236 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[32749cec-2f2d-473c-a3a0-7c984326d8a4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.237 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3cdf65bd-97b6-4d72-bd52-b6c268d2a181]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:22 compute-0 NetworkManager[55552]: <info>  [1764097822.2480] device (tap7ea961f2-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:10:22 compute-0 NetworkManager[55552]: <info>  [1764097822.2503] device (tap7ea961f2-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.259 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[e95f992f-5a3b-44d9-81a7-4ded71f12618]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.276 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.276 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[46799aac-d1db-40b3-a4ad-fa5c0e9d0334]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:22 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:22.278 104356 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpv1qtxnjg/privsep.sock']
Nov 25 19:10:22 compute-0 ovn_controller[95465]: 2025-11-25T19:10:22Z|00042|binding|INFO|Setting lport 7ea961f2-f0b6-4863-b79e-68c0609bfc1c ovn-installed in OVS
Nov 25 19:10:22 compute-0 ovn_controller[95465]: 2025-11-25T19:10:22Z|00043|binding|INFO|Setting lport 7ea961f2-f0b6-4863-b79e-68c0609bfc1c up in Southbound
Nov 25 19:10:22 compute-0 systemd-machined[153494]: New machine qemu-1-instance-00000003.
Nov 25 19:10:22 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.282 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:22 compute-0 podman[210209]: 2025-11-25 19:10:22.403899171 +0000 UTC m=+0.112665790 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.528 187216 DEBUG nova.compute.manager [req-5f042111-d1e4-40a2-b524-75ac8b515509 req-e98bb027-9df6-4280-872c-75533849835e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-plugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.530 187216 DEBUG oslo_concurrency.lockutils [req-5f042111-d1e4-40a2-b524-75ac8b515509 req-e98bb027-9df6-4280-872c-75533849835e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.531 187216 DEBUG oslo_concurrency.lockutils [req-5f042111-d1e4-40a2-b524-75ac8b515509 req-e98bb027-9df6-4280-872c-75533849835e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.531 187216 DEBUG oslo_concurrency.lockutils [req-5f042111-d1e4-40a2-b524-75ac8b515509 req-e98bb027-9df6-4280-872c-75533849835e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.531 187216 DEBUG nova.compute.manager [req-5f042111-d1e4-40a2-b524-75ac8b515509 req-e98bb027-9df6-4280-872c-75533849835e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Processing event network-vif-plugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.624 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.638 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.644 187216 INFO nova.virt.libvirt.driver [-] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Instance spawned successfully.
Nov 25 19:10:22 compute-0 nova_compute[187212]: 2025-11-25 19:10:22.645 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.160 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.161 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.161 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.162 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.163 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.164 187216 DEBUG nova.virt.libvirt.driver [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.166 104356 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.167 104356 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpv1qtxnjg/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.006 210253 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.012 210253 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.015 210253 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.015 210253 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210253
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.169 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf8eaaf-2588-4031-aa96-2af09559525b]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.594 210253 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.594 210253 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:23.594 210253 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.674 187216 INFO nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Took 12.79 seconds to spawn the instance on the hypervisor.
Nov 25 19:10:23 compute-0 nova_compute[187212]: 2025-11-25 19:10:23.676 187216 DEBUG nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.009 210253 INFO oslo_service.backend [-] Loading backend: eventlet
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.014 210253 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.088 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[cbca4e07-7f37-48e6-b201-045a784e9355]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.114 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2d679a5b-19b3-4007-b62f-88f86d12677c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 systemd-udevd[210200]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:10:24 compute-0 NetworkManager[55552]: <info>  [1764097824.1219] manager: (tapf109b9f3-40): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.178 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[2e62a950-eb4e-42bf-b5cb-a464dbd672b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.183 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[3dfb2b0f-e532-4016-8f4a-10704cc49198]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 NetworkManager[55552]: <info>  [1764097824.2217] device (tapf109b9f3-40): carrier: link connected
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.227 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[c98bfa57-f39a-413d-8d37-cd7fe2a6387e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.233 187216 INFO nova.compute.manager [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Took 18.13 seconds to build instance.
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.252 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[11a9e557-5938-4ecf-80cd-ac8efc6c9b0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf109b9f3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:82:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377188, 'reachable_time': 28450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210277, 'error': None, 'target': 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.275 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c086122e-49e2-4303-a73a-3ab325bc95f7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:824d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377188, 'tstamp': 377188}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210278, 'error': None, 'target': 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.302 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9eab52c5-57d4-4b8f-a1fc-4750fe8c6996]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf109b9f3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:82:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377188, 'reachable_time': 28450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210279, 'error': None, 'target': 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.311 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.349 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[244aec2a-cec9-4e1b-8a48-39e9ab96c759]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.437 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[e2db5058-e353-4714-a470-da7f2e132445]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.438 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf109b9f3-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.439 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.439 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf109b9f3-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:24 compute-0 NetworkManager[55552]: <info>  [1764097824.4428] manager: (tapf109b9f3-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Nov 25 19:10:24 compute-0 kernel: tapf109b9f3-40: entered promiscuous mode
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.441 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.446 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf109b9f3-40, col_values=(('external_ids', {'iface-id': '96ab31b0-301b-470c-878e-93842be7e83f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:24 compute-0 ovn_controller[95465]: 2025-11-25T19:10:24Z|00044|binding|INFO|Releasing lport 96ab31b0-301b-470c-878e-93842be7e83f from this chassis (sb_readonly=0)
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.473 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.474 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ad000232-e190-4f28-a30c-8315e404c032]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.475 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.475 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.475 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.475 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.476 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[e89415c9-224c-4311-96c8-6a3617dcf05d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.477 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.477 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[6d037109-69b6-478d-9d51-57a41f763695]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.478 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:10:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:24.479 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'env', 'PROCESS_TAG=haproxy-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.633 187216 DEBUG nova.compute.manager [req-2ff82ad9-6b71-45cc-befe-df0c647965e4 req-4966ca3d-b0c5-44d0-9a06-b050811bbc91 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-plugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.633 187216 DEBUG oslo_concurrency.lockutils [req-2ff82ad9-6b71-45cc-befe-df0c647965e4 req-4966ca3d-b0c5-44d0-9a06-b050811bbc91 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.634 187216 DEBUG oslo_concurrency.lockutils [req-2ff82ad9-6b71-45cc-befe-df0c647965e4 req-4966ca3d-b0c5-44d0-9a06-b050811bbc91 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.635 187216 DEBUG oslo_concurrency.lockutils [req-2ff82ad9-6b71-45cc-befe-df0c647965e4 req-4966ca3d-b0c5-44d0-9a06-b050811bbc91 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.635 187216 DEBUG nova.compute.manager [req-2ff82ad9-6b71-45cc-befe-df0c647965e4 req-4966ca3d-b0c5-44d0-9a06-b050811bbc91 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] No waiting events found dispatching network-vif-plugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.636 187216 WARNING nova.compute.manager [req-2ff82ad9-6b71-45cc-befe-df0c647965e4 req-4966ca3d-b0c5-44d0-9a06-b050811bbc91 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received unexpected event network-vif-plugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c for instance with vm_state active and task_state None.
Nov 25 19:10:24 compute-0 nova_compute[187212]: 2025-11-25 19:10:24.740 187216 DEBUG oslo_concurrency.lockutils [None req-8bdda06b-9b29-4bad-88c6-1c303fb6278e 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.650s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:24 compute-0 podman[210312]: 2025-11-25 19:10:24.965734886 +0000 UTC m=+0.078443982 container create c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Nov 25 19:10:25 compute-0 systemd[1]: Started libpod-conmon-c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394.scope.
Nov 25 19:10:25 compute-0 podman[210312]: 2025-11-25 19:10:24.918642836 +0000 UTC m=+0.031352003 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:10:25 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:10:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4264de68e0061cdff315fbe7ede7ba9fa9c6ca803600fc16afa910900502fa5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:10:25 compute-0 podman[210312]: 2025-11-25 19:10:25.055026745 +0000 UTC m=+0.167735901 container init c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest)
Nov 25 19:10:25 compute-0 podman[210312]: 2025-11-25 19:10:25.066696224 +0000 UTC m=+0.179405320 container start c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:10:25 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [NOTICE]   (210343) : New worker (210347) forked
Nov 25 19:10:25 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [NOTICE]   (210343) : Loading success.
Nov 25 19:10:25 compute-0 podman[210325]: 2025-11-25 19:10:25.121248612 +0000 UTC m=+0.104401501 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:10:26 compute-0 nova_compute[187212]: 2025-11-25 19:10:26.128 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:29 compute-0 nova_compute[187212]: 2025-11-25 19:10:29.316 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:29 compute-0 podman[197585]: time="2025-11-25T19:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:10:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:10:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3041 "" "Go-http-client/1.1"
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.282 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.283 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.284 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.284 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.285 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.303 187216 INFO nova.compute.manager [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Terminating instance
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.823 187216 DEBUG nova.compute.manager [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:10:30 compute-0 kernel: tap7ea961f2-f0 (unregistering): left promiscuous mode
Nov 25 19:10:30 compute-0 NetworkManager[55552]: <info>  [1764097830.8545] device (tap7ea961f2-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:10:30 compute-0 ovn_controller[95465]: 2025-11-25T19:10:30Z|00045|binding|INFO|Releasing lport 7ea961f2-f0b6-4863-b79e-68c0609bfc1c from this chassis (sb_readonly=0)
Nov 25 19:10:30 compute-0 ovn_controller[95465]: 2025-11-25T19:10:30Z|00046|binding|INFO|Setting lport 7ea961f2-f0b6-4863-b79e-68c0609bfc1c down in Southbound
Nov 25 19:10:30 compute-0 ovn_controller[95465]: 2025-11-25T19:10:30Z|00047|binding|INFO|Removing iface tap7ea961f2-f0 ovn-installed in OVS
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.863 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:30.872 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:4e:ad 10.100.0.11'], port_security=['fa:16:3e:88:4e:ad 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd2425cfd-4407-4235-bdab-f2ede4bc1f20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2bd838afa47e4be7937ff6c483757210', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a0d3ee92-eedf-42b9-8d87-eca3db518e98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be90cb2d-d452-4c14-ae54-64fed8821827, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=7ea961f2-f0b6-4863-b79e-68c0609bfc1c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:10:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:30.874 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 7ea961f2-f0b6-4863-b79e-68c0609bfc1c in datapath f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a unbound from our chassis
Nov 25 19:10:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:30.877 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:10:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:30.879 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4e4eef-2f01-4358-ae55-8ebc1d541685]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:30 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:30.880 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a namespace which is not needed anymore
Nov 25 19:10:30 compute-0 nova_compute[187212]: 2025-11-25 19:10:30.888 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:30 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 25 19:10:30 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 8.785s CPU time.
Nov 25 19:10:30 compute-0 systemd-machined[153494]: Machine qemu-1-instance-00000003 terminated.
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.005 187216 DEBUG nova.compute.manager [req-26a87715-ed2b-464d-9157-b46d1961ec71 req-a99931eb-3b8e-4128-a9d9-988fb357216c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-unplugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.006 187216 DEBUG oslo_concurrency.lockutils [req-26a87715-ed2b-464d-9157-b46d1961ec71 req-a99931eb-3b8e-4128-a9d9-988fb357216c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.006 187216 DEBUG oslo_concurrency.lockutils [req-26a87715-ed2b-464d-9157-b46d1961ec71 req-a99931eb-3b8e-4128-a9d9-988fb357216c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.006 187216 DEBUG oslo_concurrency.lockutils [req-26a87715-ed2b-464d-9157-b46d1961ec71 req-a99931eb-3b8e-4128-a9d9-988fb357216c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.007 187216 DEBUG nova.compute.manager [req-26a87715-ed2b-464d-9157-b46d1961ec71 req-a99931eb-3b8e-4128-a9d9-988fb357216c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] No waiting events found dispatching network-vif-unplugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.007 187216 DEBUG nova.compute.manager [req-26a87715-ed2b-464d-9157-b46d1961ec71 req-a99931eb-3b8e-4128-a9d9-988fb357216c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-unplugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:10:31 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [NOTICE]   (210343) : haproxy version is 3.0.5-8e879a5
Nov 25 19:10:31 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [NOTICE]   (210343) : path to executable is /usr/sbin/haproxy
Nov 25 19:10:31 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [WARNING]  (210343) : Exiting Master process...
Nov 25 19:10:31 compute-0 podman[210388]: 2025-11-25 19:10:31.061345873 +0000 UTC m=+0.051855057 container kill c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:10:31 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [ALERT]    (210343) : Current worker (210347) exited with code 143 (Terminated)
Nov 25 19:10:31 compute-0 neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a[210328]: [WARNING]  (210343) : All workers exited. Exiting... (0)
Nov 25 19:10:31 compute-0 systemd[1]: libpod-c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394.scope: Deactivated successfully.
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.074 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.074 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.074 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.101 187216 INFO nova.virt.libvirt.driver [-] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Instance destroyed successfully.
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.103 187216 DEBUG nova.objects.instance [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lazy-loading 'resources' on Instance uuid d2425cfd-4407-4235-bdab-f2ede4bc1f20 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.168 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 podman[210411]: 2025-11-25 19:10:31.174492215 +0000 UTC m=+0.082238853 container died c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:10:31 compute-0 podman[210411]: 2025-11-25 19:10:31.223400653 +0000 UTC m=+0.131147201 container cleanup c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Nov 25 19:10:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4264de68e0061cdff315fbe7ede7ba9fa9c6ca803600fc16afa910900502fa5-merged.mount: Deactivated successfully.
Nov 25 19:10:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394-userdata-shm.mount: Deactivated successfully.
Nov 25 19:10:31 compute-0 systemd[1]: libpod-conmon-c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394.scope: Deactivated successfully.
Nov 25 19:10:31 compute-0 podman[210426]: 2025-11-25 19:10:31.257300622 +0000 UTC m=+0.137162880 container remove c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.269 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8125e1-68fd-4914-a0b7-8ae3ad30f939]: (4, ("Tue Nov 25 07:10:30 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a (c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394)\nc615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394\nTue Nov 25 07:10:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a (c615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394)\nc615f6de21760e095335255a04a184ebe32096292e7d689984bb5c3c1b5d3394\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.272 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[28ad77f3-6ead-41dd-9949-b5350daf3409]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.272 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.272 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[6aaa9f16-428d-43a2-a347-c45f139b3fc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.273 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf109b9f3-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:31 compute-0 kernel: tapf109b9f3-40: left promiscuous mode
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.276 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.302 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.309 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1af9aa61-b496-4c8a-9ebc-6b16b22037e1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.330 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[40289c2c-5428-442e-b906-cd7f84c88217]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.331 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[fe85e0d2-a35c-4001-98ac-8b5d3ac73473]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 podman[210445]: 2025-11-25 19:10:31.331007227 +0000 UTC m=+0.090899062 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.357 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[aea7047a-2639-4ddc-bdc1-4488a547d4e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377174, 'reachable_time': 20175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210473, 'error': None, 'target': 'ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 systemd[1]: run-netns-ovnmeta\x2df109b9f3\x2d4f0c\x2d40f4\x2d81d9\x2d4bf032aa2d0a.mount: Deactivated successfully.
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.374 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:10:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:31.376 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[efb3bd2b-7417-4e4f-aef5-0451143e1c9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: ERROR   19:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: ERROR   19:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: ERROR   19:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: ERROR   19:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: ERROR   19:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:10:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.612 187216 DEBUG nova.virt.libvirt.vif [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:10:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1521363589',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1521363589',id=3,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:10:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2bd838afa47e4be7937ff6c483757210',ramdisk_id='',reservation_id='r-oa000mtz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-TestDataModel-229409854',owner_user_name='tempest-TestDataModel-229409854-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:10:23Z,user_data=None,user_id='7cbf59052404450fbcefb08d20815105',uuid=d2425cfd-4407-4235-bdab-f2ede4bc1f20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.613 187216 DEBUG nova.network.os_vif_util [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Converting VIF {"id": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "address": "fa:16:3e:88:4e:ad", "network": {"id": "f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a", "bridge": "br-int", "label": "tempest-TestDataModel-1408934864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77f77932d952469abb3b5c9ccc266fb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ea961f2-f0", "ovs_interfaceid": "7ea961f2-f0b6-4863-b79e-68c0609bfc1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.615 187216 DEBUG nova.network.os_vif_util [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.616 187216 DEBUG os_vif [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.620 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.621 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ea961f2-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.623 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.625 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.626 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.626 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ac259a5b-e543-45c8-88ed-dd18e0af368c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.629 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.633 187216 INFO os_vif [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:4e:ad,bridge_name='br-int',has_traffic_filtering=True,id=7ea961f2-f0b6-4863-b79e-68c0609bfc1c,network=Network(f109b9f3-4f0c-40f4-81d9-4bf032aa2d0a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ea961f2-f0')
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.634 187216 INFO nova.virt.libvirt.driver [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Deleting instance files /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20_del
Nov 25 19:10:31 compute-0 nova_compute[187212]: 2025-11-25 19:10:31.635 187216 INFO nova.virt.libvirt.driver [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Deletion of /var/lib/nova/instances/d2425cfd-4407-4235-bdab-f2ede4bc1f20_del complete
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.152 187216 INFO nova.compute.manager [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Took 1.33 seconds to destroy the instance on the hypervisor.
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.152 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.153 187216 DEBUG nova.compute.manager [-] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.153 187216 DEBUG nova.network.neutron [-] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.153 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.906 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:10:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:32.941 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:10:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:32.942 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:10:32 compute-0 nova_compute[187212]: 2025-11-25 19:10:32.943 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:33 compute-0 nova_compute[187212]: 2025-11-25 19:10:33.054 187216 DEBUG nova.compute.manager [req-d35f4ef0-048b-44f0-ac2c-a7869f991686 req-d92d7f34-9f1c-4d3b-be62-b4930860fa6f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-unplugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:10:33 compute-0 nova_compute[187212]: 2025-11-25 19:10:33.055 187216 DEBUG oslo_concurrency.lockutils [req-d35f4ef0-048b-44f0-ac2c-a7869f991686 req-d92d7f34-9f1c-4d3b-be62-b4930860fa6f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:33 compute-0 nova_compute[187212]: 2025-11-25 19:10:33.055 187216 DEBUG oslo_concurrency.lockutils [req-d35f4ef0-048b-44f0-ac2c-a7869f991686 req-d92d7f34-9f1c-4d3b-be62-b4930860fa6f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:33 compute-0 nova_compute[187212]: 2025-11-25 19:10:33.055 187216 DEBUG oslo_concurrency.lockutils [req-d35f4ef0-048b-44f0-ac2c-a7869f991686 req-d92d7f34-9f1c-4d3b-be62-b4930860fa6f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:33 compute-0 nova_compute[187212]: 2025-11-25 19:10:33.055 187216 DEBUG nova.compute.manager [req-d35f4ef0-048b-44f0-ac2c-a7869f991686 req-d92d7f34-9f1c-4d3b-be62-b4930860fa6f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] No waiting events found dispatching network-vif-unplugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:10:33 compute-0 nova_compute[187212]: 2025-11-25 19:10:33.056 187216 DEBUG nova.compute.manager [req-d35f4ef0-048b-44f0-ac2c-a7869f991686 req-d92d7f34-9f1c-4d3b-be62-b4930860fa6f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-unplugged-7ea961f2-f0b6-4863-b79e-68c0609bfc1c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:10:34 compute-0 nova_compute[187212]: 2025-11-25 19:10:34.254 187216 DEBUG nova.network.neutron [-] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:10:34 compute-0 nova_compute[187212]: 2025-11-25 19:10:34.763 187216 INFO nova.compute.manager [-] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Took 2.61 seconds to deallocate network for instance.
Nov 25 19:10:35 compute-0 nova_compute[187212]: 2025-11-25 19:10:35.111 187216 DEBUG nova.compute.manager [req-3c4814e9-9dd6-4058-8334-1ca03dbf305a req-3a168d8d-0418-438d-845e-d752770d1b89 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: d2425cfd-4407-4235-bdab-f2ede4bc1f20] Received event network-vif-deleted-7ea961f2-f0b6-4863-b79e-68c0609bfc1c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:10:35 compute-0 podman[210477]: 2025-11-25 19:10:35.151446223 +0000 UTC m=+0.071485998 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:10:35 compute-0 nova_compute[187212]: 2025-11-25 19:10:35.283 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:10:35 compute-0 nova_compute[187212]: 2025-11-25 19:10:35.284 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:10:35 compute-0 nova_compute[187212]: 2025-11-25 19:10:35.353 187216 DEBUG nova.compute.provider_tree [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:10:35 compute-0 nova_compute[187212]: 2025-11-25 19:10:35.862 187216 DEBUG nova.scheduler.client.report [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:10:36 compute-0 nova_compute[187212]: 2025-11-25 19:10:36.216 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:36 compute-0 nova_compute[187212]: 2025-11-25 19:10:36.373 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:36 compute-0 nova_compute[187212]: 2025-11-25 19:10:36.418 187216 INFO nova.scheduler.client.report [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Deleted allocations for instance d2425cfd-4407-4235-bdab-f2ede4bc1f20
Nov 25 19:10:36 compute-0 nova_compute[187212]: 2025-11-25 19:10:36.629 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:37 compute-0 nova_compute[187212]: 2025-11-25 19:10:37.451 187216 DEBUG oslo_concurrency.lockutils [None req-a311178c-ba2a-48ef-b498-14326a5e9706 7cbf59052404450fbcefb08d20815105 2bd838afa47e4be7937ff6c483757210 - - default default] Lock "d2425cfd-4407-4235-bdab-f2ede4bc1f20" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:10:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:10:40.945 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:10:41 compute-0 nova_compute[187212]: 2025-11-25 19:10:41.261 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:41 compute-0 nova_compute[187212]: 2025-11-25 19:10:41.632 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:46 compute-0 podman[210499]: 2025-11-25 19:10:46.199289342 +0000 UTC m=+0.090683577 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:10:46 compute-0 nova_compute[187212]: 2025-11-25 19:10:46.299 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:46 compute-0 nova_compute[187212]: 2025-11-25 19:10:46.635 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:50 compute-0 nova_compute[187212]: 2025-11-25 19:10:50.960 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:51 compute-0 nova_compute[187212]: 2025-11-25 19:10:51.338 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:51 compute-0 nova_compute[187212]: 2025-11-25 19:10:51.638 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:53 compute-0 podman[210523]: 2025-11-25 19:10:53.238331089 +0000 UTC m=+0.149616801 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 19:10:56 compute-0 podman[210551]: 2025-11-25 19:10:56.165260901 +0000 UTC m=+0.083461035 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:10:56 compute-0 nova_compute[187212]: 2025-11-25 19:10:56.362 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:56 compute-0 nova_compute[187212]: 2025-11-25 19:10:56.640 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:10:59 compute-0 podman[197585]: time="2025-11-25T19:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:10:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:10:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Nov 25 19:11:01 compute-0 nova_compute[187212]: 2025-11-25 19:11:01.365 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: ERROR   19:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: ERROR   19:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: ERROR   19:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: ERROR   19:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: ERROR   19:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:11:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:11:01 compute-0 nova_compute[187212]: 2025-11-25 19:11:01.643 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:02 compute-0 podman[210570]: 2025-11-25 19:11:02.165658261 +0000 UTC m=+0.085316524 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 25 19:11:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:04.705 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:f9:63 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b7f575a862343fbb3396239106e3968', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a599677f-a9c8-4759-a6d8-6e08d6b4e0d1) old=Port_Binding(mac=['fa:16:3e:f4:f9:63'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b7f575a862343fbb3396239106e3968', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:11:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:04.706 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a599677f-a9c8-4759-a6d8-6e08d6b4e0d1 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 updated
Nov 25 19:11:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:04.707 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:11:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:04.709 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c595f260-80fd-4f61-99fb-3a795ef64a4b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:11:06 compute-0 podman[210592]: 2025-11-25 19:11:06.182108938 +0000 UTC m=+0.099349457 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest)
Nov 25 19:11:06 compute-0 nova_compute[187212]: 2025-11-25 19:11:06.401 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:06 compute-0 nova_compute[187212]: 2025-11-25 19:11:06.644 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:11 compute-0 nova_compute[187212]: 2025-11-25 19:11:11.403 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:11 compute-0 nova_compute[187212]: 2025-11-25 19:11:11.646 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:12 compute-0 nova_compute[187212]: 2025-11-25 19:11:12.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:12 compute-0 nova_compute[187212]: 2025-11-25 19:11:12.172 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:11:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:13.991 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:8b:0b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-26c20bd3-d8e3-4015-beaa-379a30b2e575', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26c20bd3-d8e3-4015-beaa-379a30b2e575', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1212aa74-6abc-4214-90b1-b8e86df745a1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0752874a-9d2f-4262-9a64-beb2369f2bb7) old=Port_Binding(mac=['fa:16:3e:04:8b:0b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-26c20bd3-d8e3-4015-beaa-379a30b2e575', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-26c20bd3-d8e3-4015-beaa-379a30b2e575', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:11:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:13.993 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0752874a-9d2f-4262-9a64-beb2369f2bb7 in datapath 26c20bd3-d8e3-4015-beaa-379a30b2e575 updated
Nov 25 19:11:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:13.994 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 26c20bd3-d8e3-4015-beaa-379a30b2e575, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:11:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:13.995 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[7361cb65-73c9-4eab-880a-7d2c05645c2d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:11:14 compute-0 nova_compute[187212]: 2025-11-25 19:11:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:15 compute-0 nova_compute[187212]: 2025-11-25 19:11:15.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:15 compute-0 nova_compute[187212]: 2025-11-25 19:11:15.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:16 compute-0 nova_compute[187212]: 2025-11-25 19:11:16.405 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:16 compute-0 nova_compute[187212]: 2025-11-25 19:11:16.648 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:17 compute-0 podman[210614]: 2025-11-25 19:11:17.195363429 +0000 UTC m=+0.108476749 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.940 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.942 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.976 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.978 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5877MB free_disk=72.99710464477539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.978 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:11:17 compute-0 nova_compute[187212]: 2025-11-25 19:11:17.979 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:11:19 compute-0 nova_compute[187212]: 2025-11-25 19:11:19.049 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:11:19 compute-0 nova_compute[187212]: 2025-11-25 19:11:19.050 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:11:17 up  1:03,  0 user,  load average: 0.39, 0.32, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:11:19 compute-0 nova_compute[187212]: 2025-11-25 19:11:19.076 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:11:19 compute-0 nova_compute[187212]: 2025-11-25 19:11:19.586 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:11:20 compute-0 nova_compute[187212]: 2025-11-25 19:11:20.098 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:11:20 compute-0 nova_compute[187212]: 2025-11-25 19:11:20.099 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:11:21 compute-0 nova_compute[187212]: 2025-11-25 19:11:21.408 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:21 compute-0 nova_compute[187212]: 2025-11-25 19:11:21.651 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:22 compute-0 nova_compute[187212]: 2025-11-25 19:11:22.099 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:22 compute-0 nova_compute[187212]: 2025-11-25 19:11:22.100 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:22 compute-0 nova_compute[187212]: 2025-11-25 19:11:22.101 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:23 compute-0 nova_compute[187212]: 2025-11-25 19:11:23.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:11:24 compute-0 podman[210641]: 2025-11-25 19:11:24.215724672 +0000 UTC m=+0.135532697 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:11:26 compute-0 ovn_controller[95465]: 2025-11-25T19:11:26Z|00048|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 19:11:26 compute-0 nova_compute[187212]: 2025-11-25 19:11:26.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:26 compute-0 nova_compute[187212]: 2025-11-25 19:11:26.654 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:27 compute-0 podman[210667]: 2025-11-25 19:11:27.154803694 +0000 UTC m=+0.070822681 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:11:29 compute-0 podman[197585]: time="2025-11-25T19:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:11:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:11:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2584 "" "Go-http-client/1.1"
Nov 25 19:11:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:31.076 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:11:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:31.076 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:11:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:31.076 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: ERROR   19:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: ERROR   19:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: ERROR   19:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: ERROR   19:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: ERROR   19:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:11:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:11:31 compute-0 nova_compute[187212]: 2025-11-25 19:11:31.455 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:31 compute-0 nova_compute[187212]: 2025-11-25 19:11:31.656 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:33 compute-0 podman[210690]: 2025-11-25 19:11:33.159559253 +0000 UTC m=+0.082186688 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6)
Nov 25 19:11:36 compute-0 nova_compute[187212]: 2025-11-25 19:11:36.456 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:36 compute-0 nova_compute[187212]: 2025-11-25 19:11:36.658 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:37 compute-0 podman[210711]: 2025-11-25 19:11:37.166721658 +0000 UTC m=+0.087157422 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Nov 25 19:11:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:39.421 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:11:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:39.422 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:11:39 compute-0 nova_compute[187212]: 2025-11-25 19:11:39.423 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:41 compute-0 nova_compute[187212]: 2025-11-25 19:11:41.491 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:41 compute-0 nova_compute[187212]: 2025-11-25 19:11:41.659 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:46 compute-0 nova_compute[187212]: 2025-11-25 19:11:46.494 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:46 compute-0 nova_compute[187212]: 2025-11-25 19:11:46.662 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:48 compute-0 podman[210732]: 2025-11-25 19:11:48.215108811 +0000 UTC m=+0.120757029 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:11:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:11:48.426 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:11:51 compute-0 nova_compute[187212]: 2025-11-25 19:11:51.523 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:51 compute-0 nova_compute[187212]: 2025-11-25 19:11:51.663 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:55 compute-0 podman[210756]: 2025-11-25 19:11:55.221004776 +0000 UTC m=+0.132685400 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 25 19:11:56 compute-0 nova_compute[187212]: 2025-11-25 19:11:56.571 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:56 compute-0 nova_compute[187212]: 2025-11-25 19:11:56.665 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:11:56 compute-0 nova_compute[187212]: 2025-11-25 19:11:56.733 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:11:56 compute-0 nova_compute[187212]: 2025-11-25 19:11:56.734 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:11:57 compute-0 nova_compute[187212]: 2025-11-25 19:11:57.241 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:11:57 compute-0 nova_compute[187212]: 2025-11-25 19:11:57.806 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:11:57 compute-0 nova_compute[187212]: 2025-11-25 19:11:57.807 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:11:57 compute-0 nova_compute[187212]: 2025-11-25 19:11:57.817 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:11:57 compute-0 nova_compute[187212]: 2025-11-25 19:11:57.818 187216 INFO nova.compute.claims [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:11:58 compute-0 podman[210784]: 2025-11-25 19:11:58.165295889 +0000 UTC m=+0.084855600 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 19:11:58 compute-0 nova_compute[187212]: 2025-11-25 19:11:58.880 187216 DEBUG nova.compute.provider_tree [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:11:59 compute-0 nova_compute[187212]: 2025-11-25 19:11:59.387 187216 DEBUG nova.scheduler.client.report [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:11:59 compute-0 podman[197585]: time="2025-11-25T19:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:11:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:11:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2583 "" "Go-http-client/1.1"
Nov 25 19:11:59 compute-0 nova_compute[187212]: 2025-11-25 19:11:59.898 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:11:59 compute-0 nova_compute[187212]: 2025-11-25 19:11:59.899 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:12:00 compute-0 nova_compute[187212]: 2025-11-25 19:12:00.413 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:12:00 compute-0 nova_compute[187212]: 2025-11-25 19:12:00.413 187216 DEBUG nova.network.neutron [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:12:00 compute-0 nova_compute[187212]: 2025-11-25 19:12:00.414 187216 WARNING neutronclient.v2_0.client [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:00 compute-0 nova_compute[187212]: 2025-11-25 19:12:00.415 187216 WARNING neutronclient.v2_0.client [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:00 compute-0 nova_compute[187212]: 2025-11-25 19:12:00.923 187216 INFO nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: ERROR   19:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: ERROR   19:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: ERROR   19:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:12:01 compute-0 nova_compute[187212]: 2025-11-25 19:12:01.425 187216 DEBUG nova.network.neutron [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Successfully created port: e568cb76-eb81-4449-aed6-d84ad4a0f086 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:12:01 compute-0 nova_compute[187212]: 2025-11-25 19:12:01.432 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: ERROR   19:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: ERROR   19:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:12:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:12:01 compute-0 nova_compute[187212]: 2025-11-25 19:12:01.571 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:01 compute-0 nova_compute[187212]: 2025-11-25 19:12:01.668 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.457 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.459 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.459 187216 INFO nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Creating image(s)
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.460 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "/var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.460 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "/var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.461 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "/var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.462 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.467 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.470 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.504 187216 DEBUG nova.network.neutron [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Successfully updated port: e568cb76-eb81-4449-aed6-d84ad4a0f086 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.561 187216 DEBUG nova.compute.manager [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-changed-e568cb76-eb81-4449-aed6-d84ad4a0f086 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.562 187216 DEBUG nova.compute.manager [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Refreshing instance network info cache due to event network-changed-e568cb76-eb81-4449-aed6-d84ad4a0f086. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.562 187216 DEBUG oslo_concurrency.lockutils [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-909b423a-9e57-4bb8-b6b5-719b05724d71" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.563 187216 DEBUG oslo_concurrency.lockutils [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-909b423a-9e57-4bb8-b6b5-719b05724d71" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.563 187216 DEBUG nova.network.neutron [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Refreshing network info cache for port e568cb76-eb81-4449-aed6-d84ad4a0f086 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.566 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.567 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.568 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.569 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.577 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.578 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.596 187216 WARNING neutronclient.v2_0.client [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.660 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.661 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.791 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk 1073741824" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.793 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.225s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.793 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.876 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.878 187216 DEBUG nova.virt.disk.api [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Checking if we can resize image /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.878 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.965 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.967 187216 DEBUG nova.virt.disk.api [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Cannot resize image /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.968 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.969 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Ensure instance console log exists: /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.970 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.970 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.971 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:02 compute-0 nova_compute[187212]: 2025-11-25 19:12:02.998 187216 DEBUG nova.network.neutron [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:12:03 compute-0 nova_compute[187212]: 2025-11-25 19:12:03.013 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "refresh_cache-909b423a-9e57-4bb8-b6b5-719b05724d71" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:12:03 compute-0 nova_compute[187212]: 2025-11-25 19:12:03.190 187216 DEBUG nova.network.neutron [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:12:03 compute-0 nova_compute[187212]: 2025-11-25 19:12:03.696 187216 DEBUG oslo_concurrency.lockutils [req-60071049-423a-429b-95ff-477344a34029 req-62581bac-4766-4e43-97b5-29fddd36d4c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-909b423a-9e57-4bb8-b6b5-719b05724d71" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:12:03 compute-0 nova_compute[187212]: 2025-11-25 19:12:03.698 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquired lock "refresh_cache-909b423a-9e57-4bb8-b6b5-719b05724d71" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:12:03 compute-0 nova_compute[187212]: 2025-11-25 19:12:03.698 187216 DEBUG nova.network.neutron [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:12:04 compute-0 podman[210818]: 2025-11-25 19:12:04.170654525 +0000 UTC m=+0.091780346 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 25 19:12:04 compute-0 nova_compute[187212]: 2025-11-25 19:12:04.320 187216 DEBUG nova.network.neutron [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:12:04 compute-0 nova_compute[187212]: 2025-11-25 19:12:04.634 187216 WARNING neutronclient.v2_0.client [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:04 compute-0 nova_compute[187212]: 2025-11-25 19:12:04.993 187216 DEBUG nova.network.neutron [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Updating instance_info_cache with network_info: [{"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.500 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Releasing lock "refresh_cache-909b423a-9e57-4bb8-b6b5-719b05724d71" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.500 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Instance network_info: |[{"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.504 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Start _get_guest_xml network_info=[{"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.510 187216 WARNING nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.512 187216 DEBUG nova.virt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-827340657', uuid='909b423a-9e57-4bb8-b6b5-719b05724d71'), owner=OwnerMeta(userid='7c561073d7c34a029574a6e2fb952944', username='tempest-TestExecuteActionsViaActuator-1103022868-project-admin', projectid='780511b4bf4d49299cc4d9b324261841', projectname='tempest-TestExecuteActionsViaActuator-1103022868'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": 
"e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764097925.512501) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.517 187216 DEBUG nova.virt.libvirt.host [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.518 187216 DEBUG nova.virt.libvirt.host [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.522 187216 DEBUG nova.virt.libvirt.host [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.523 187216 DEBUG nova.virt.libvirt.host [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.525 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.525 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.526 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.526 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.527 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.527 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.527 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.528 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.528 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.529 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.529 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.529 187216 DEBUG nova.virt.hardware [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.536 187216 DEBUG nova.virt.libvirt.vif [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:11:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-827340657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-827340657',id=5,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-zf07q39k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1103022868-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:12:01Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=909b423a-9e57-4bb8-b6b5-719b05724d71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.536 187216 DEBUG nova.network.os_vif_util [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.537 187216 DEBUG nova.network.os_vif_util [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:12:05 compute-0 nova_compute[187212]: 2025-11-25 19:12:05.539 187216 DEBUG nova.objects.instance [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'pci_devices' on Instance uuid 909b423a-9e57-4bb8-b6b5-719b05724d71 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.049 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <uuid>909b423a-9e57-4bb8-b6b5-719b05724d71</uuid>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <name>instance-00000005</name>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-827340657</nova:name>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:12:05</nova:creationTime>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:12:06 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:12:06 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:user uuid="7c561073d7c34a029574a6e2fb952944">tempest-TestExecuteActionsViaActuator-1103022868-project-admin</nova:user>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:project uuid="780511b4bf4d49299cc4d9b324261841">tempest-TestExecuteActionsViaActuator-1103022868</nova:project>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         <nova:port uuid="e568cb76-eb81-4449-aed6-d84ad4a0f086">
Nov 25 19:12:06 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <system>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <entry name="serial">909b423a-9e57-4bb8-b6b5-719b05724d71</entry>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <entry name="uuid">909b423a-9e57-4bb8-b6b5-719b05724d71</entry>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </system>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <os>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </os>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <features>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </features>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.config"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:82:ba:48"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <target dev="tape568cb76-eb"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/console.log" append="off"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <video>
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </video>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:12:06 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:12:06 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:12:06 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:12:06 compute-0 nova_compute[187212]: </domain>
Nov 25 19:12:06 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.050 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Preparing to wait for external event network-vif-plugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.050 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.050 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.051 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.052 187216 DEBUG nova.virt.libvirt.vif [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:11:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-827340657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-827340657',id=5,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-zf07q39k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:12:01Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=909b423a-9e57-4bb8-b6b5-719b05724d71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.053 187216 DEBUG nova.network.os_vif_util [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.054 187216 DEBUG nova.network.os_vif_util [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.054 187216 DEBUG os_vif [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.056 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.057 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.058 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.058 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2ec8997f-7882-5983-8954-9553a39e2770', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.062 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.066 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.066 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape568cb76-eb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.068 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape568cb76-eb, col_values=(('qos', UUID('e7a644f5-e5d1-4aaf-9fc5-c769895e04a8')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.068 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape568cb76-eb, col_values=(('external_ids', {'iface-id': 'e568cb76-eb81-4449-aed6-d84ad4a0f086', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:ba:48', 'vm-uuid': '909b423a-9e57-4bb8-b6b5-719b05724d71'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:06 compute-0 NetworkManager[55552]: <info>  [1764097926.0719] manager: (tape568cb76-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.073 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.080 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.081 187216 INFO os_vif [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb')
Nov 25 19:12:06 compute-0 nova_compute[187212]: 2025-11-25 19:12:06.574 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:07 compute-0 nova_compute[187212]: 2025-11-25 19:12:07.684 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:12:07 compute-0 nova_compute[187212]: 2025-11-25 19:12:07.684 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:12:07 compute-0 nova_compute[187212]: 2025-11-25 19:12:07.685 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No VIF found with MAC fa:16:3e:82:ba:48, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:12:07 compute-0 nova_compute[187212]: 2025-11-25 19:12:07.686 187216 INFO nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Using config drive
Nov 25 19:12:08 compute-0 podman[210842]: 2025-11-25 19:12:08.177984093 +0000 UTC m=+0.092790344 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Nov 25 19:12:08 compute-0 nova_compute[187212]: 2025-11-25 19:12:08.199 187216 WARNING neutronclient.v2_0.client [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.134 187216 INFO nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Creating config drive at /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.config
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.141 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpd3ifllb9 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.287 187216 DEBUG oslo_concurrency.processutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpd3ifllb9" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:09 compute-0 kernel: tape568cb76-eb: entered promiscuous mode
Nov 25 19:12:09 compute-0 NetworkManager[55552]: <info>  [1764097929.3750] manager: (tape568cb76-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Nov 25 19:12:09 compute-0 ovn_controller[95465]: 2025-11-25T19:12:09Z|00049|binding|INFO|Claiming lport e568cb76-eb81-4449-aed6-d84ad4a0f086 for this chassis.
Nov 25 19:12:09 compute-0 ovn_controller[95465]: 2025-11-25T19:12:09Z|00050|binding|INFO|e568cb76-eb81-4449-aed6-d84ad4a0f086: Claiming fa:16:3e:82:ba:48 10.100.0.5
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.376 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.382 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.385 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.394 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.403 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:ba:48 10.100.0.5'], port_security=['fa:16:3e:82:ba:48 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '909b423a-9e57-4bb8-b6b5-719b05724d71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=e568cb76-eb81-4449-aed6-d84ad4a0f086) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.404 104356 INFO neutron.agent.ovn.metadata.agent [-] Port e568cb76-eb81-4449-aed6-d84ad4a0f086 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 bound to our chassis
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.407 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.425 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2925827d-deab-45a8-a103-b85ef47bf337]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.426 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap22e324dc-31 in ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:12:09 compute-0 systemd-udevd[210881]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.436 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap22e324dc-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:12:09 compute-0 systemd-machined[153494]: New machine qemu-2-instance-00000005.
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.437 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2dfb856a-a814-4d53-ab55-1b5908875047]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.439 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2ca911f5-3c3a-4d0d-9665-74a8e762ae91]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 NetworkManager[55552]: <info>  [1764097929.4577] device (tape568cb76-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:12:09 compute-0 NetworkManager[55552]: <info>  [1764097929.4605] device (tape568cb76-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:12:09 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.459 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[6451a8c0-f763-4565-9c79-cb65b4311dfe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_controller[95465]: 2025-11-25T19:12:09Z|00051|binding|INFO|Setting lport e568cb76-eb81-4449-aed6-d84ad4a0f086 ovn-installed in OVS
Nov 25 19:12:09 compute-0 ovn_controller[95465]: 2025-11-25T19:12:09Z|00052|binding|INFO|Setting lport e568cb76-eb81-4449-aed6-d84ad4a0f086 up in Southbound
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.476 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.483 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[67bab5d5-d9c5-4424-a0ee-abed223bc8ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.527 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[3408cf67-53ac-4024-a9ad-7dca3c7a3995]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.533 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[483cf74e-8514-4414-8820-d55c2ec967dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 NetworkManager[55552]: <info>  [1764097929.5350] manager: (tap22e324dc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Nov 25 19:12:09 compute-0 systemd-udevd[210884]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.581 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9279775b-a90e-4374-84b9-4b563f1e2f90]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.585 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[2026d146-6a26-408d-bdd5-cb9432474dc9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 NetworkManager[55552]: <info>  [1764097929.6168] device (tap22e324dc-30): carrier: link connected
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.625 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[46643d3e-36d0-47fe-9f30-1aea1e7f97d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.650 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[53e68123-d045-4946-a82b-43c6820b1637]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 44288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210913, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.672 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d01bfa-83a0-4ce4-84b3-821c4f3d514c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:f963'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387728, 'tstamp': 387728}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210914, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.692 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c83c1801-fc47-41d1-844c-31cda3040abf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 44288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210916, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.733 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[67d9a30c-27ee-4639-9f3b-70c471a8686a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.827 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1fcbebfe-ed4f-4996-96a6-551107ee3739]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.828 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.829 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.829 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.831 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 NetworkManager[55552]: <info>  [1764097929.8329] manager: (tap22e324dc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 25 19:12:09 compute-0 kernel: tap22e324dc-30: entered promiscuous mode
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.835 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.837 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.838 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 ovn_controller[95465]: 2025-11-25T19:12:09Z|00053|binding|INFO|Releasing lport a599677f-a9c8-4759-a6d8-6e08d6b4e0d1 from this chassis (sb_readonly=0)
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.861 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 nova_compute[187212]: 2025-11-25 19:12:09.864 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.866 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4998228d-475e-42d1-ac4d-5621674bf143]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.867 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.867 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.867 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.867 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.868 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[55a7ccab-2edc-4898-8684-610895765318]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.868 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.869 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9c08d6d8-2b6d-484a-814f-1e9b478cf893]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.869 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:12:09 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:09.870 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'env', 'PROCESS_TAG=haproxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.169 187216 DEBUG nova.compute.manager [req-e58165a4-89a7-4ae4-ba91-b654c83d06a2 req-0eeed3cc-5cf5-4186-96de-5a2b743a1069 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-plugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.170 187216 DEBUG oslo_concurrency.lockutils [req-e58165a4-89a7-4ae4-ba91-b654c83d06a2 req-0eeed3cc-5cf5-4186-96de-5a2b743a1069 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.171 187216 DEBUG oslo_concurrency.lockutils [req-e58165a4-89a7-4ae4-ba91-b654c83d06a2 req-0eeed3cc-5cf5-4186-96de-5a2b743a1069 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.171 187216 DEBUG oslo_concurrency.lockutils [req-e58165a4-89a7-4ae4-ba91-b654c83d06a2 req-0eeed3cc-5cf5-4186-96de-5a2b743a1069 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.172 187216 DEBUG nova.compute.manager [req-e58165a4-89a7-4ae4-ba91-b654c83d06a2 req-0eeed3cc-5cf5-4186-96de-5a2b743a1069 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Processing event network-vif-plugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.173 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.178 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.182 187216 INFO nova.virt.libvirt.driver [-] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Instance spawned successfully.
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.183 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:12:10 compute-0 podman[210954]: 2025-11-25 19:12:10.375667816 +0000 UTC m=+0.078826077 container create 30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125)
Nov 25 19:12:10 compute-0 systemd[1]: Started libpod-conmon-30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd.scope.
Nov 25 19:12:10 compute-0 podman[210954]: 2025-11-25 19:12:10.334318011 +0000 UTC m=+0.037476322 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:12:10 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:12:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95d29816be6458cdb1ec30a1e513ae35fdeb26125479a4fc22754edf1e40a904/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:12:10 compute-0 podman[210954]: 2025-11-25 19:12:10.484572804 +0000 UTC m=+0.187731125 container init 30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 19:12:10 compute-0 podman[210954]: 2025-11-25 19:12:10.494834521 +0000 UTC m=+0.197992792 container start 30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 19:12:10 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [NOTICE]   (210973) : New worker (210975) forked
Nov 25 19:12:10 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [NOTICE]   (210973) : Loading success.
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.701 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.702 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.702 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.703 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.704 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:12:10 compute-0 nova_compute[187212]: 2025-11-25 19:12:10.704 187216 DEBUG nova.virt.libvirt.driver [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:12:11 compute-0 nova_compute[187212]: 2025-11-25 19:12:11.071 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:11 compute-0 nova_compute[187212]: 2025-11-25 19:12:11.216 187216 INFO nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Took 8.76 seconds to spawn the instance on the hypervisor.
Nov 25 19:12:11 compute-0 nova_compute[187212]: 2025-11-25 19:12:11.216 187216 DEBUG nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:12:11 compute-0 nova_compute[187212]: 2025-11-25 19:12:11.610 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:11 compute-0 nova_compute[187212]: 2025-11-25 19:12:11.766 187216 INFO nova.compute.manager [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Took 14.01 seconds to build instance.
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.264 187216 DEBUG nova.compute.manager [req-1377e0b5-8e6c-4707-bcab-3bdc99e2838c req-812b3793-dca1-4466-ac37-f9c6afa05dab 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-plugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.264 187216 DEBUG oslo_concurrency.lockutils [req-1377e0b5-8e6c-4707-bcab-3bdc99e2838c req-812b3793-dca1-4466-ac37-f9c6afa05dab 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.265 187216 DEBUG oslo_concurrency.lockutils [req-1377e0b5-8e6c-4707-bcab-3bdc99e2838c req-812b3793-dca1-4466-ac37-f9c6afa05dab 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.265 187216 DEBUG oslo_concurrency.lockutils [req-1377e0b5-8e6c-4707-bcab-3bdc99e2838c req-812b3793-dca1-4466-ac37-f9c6afa05dab 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.265 187216 DEBUG nova.compute.manager [req-1377e0b5-8e6c-4707-bcab-3bdc99e2838c req-812b3793-dca1-4466-ac37-f9c6afa05dab 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] No waiting events found dispatching network-vif-plugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.266 187216 WARNING nova.compute.manager [req-1377e0b5-8e6c-4707-bcab-3bdc99e2838c req-812b3793-dca1-4466-ac37-f9c6afa05dab 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received unexpected event network-vif-plugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 for instance with vm_state active and task_state None.
Nov 25 19:12:12 compute-0 nova_compute[187212]: 2025-11-25 19:12:12.272 187216 DEBUG oslo_concurrency.lockutils [None req-20b08f42-116d-4fd8-b1a9-fb816d7a6fb0 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.538s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:14 compute-0 nova_compute[187212]: 2025-11-25 19:12:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:14 compute-0 nova_compute[187212]: 2025-11-25 19:12:14.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:14 compute-0 nova_compute[187212]: 2025-11-25 19:12:14.176 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:12:16 compute-0 nova_compute[187212]: 2025-11-25 19:12:16.072 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:16 compute-0 nova_compute[187212]: 2025-11-25 19:12:16.643 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:17 compute-0 nova_compute[187212]: 2025-11-25 19:12:17.171 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:17 compute-0 nova_compute[187212]: 2025-11-25 19:12:17.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:19 compute-0 podman[210984]: 2025-11-25 19:12:19.173368906 +0000 UTC m=+0.089736701 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:12:19 compute-0 nova_compute[187212]: 2025-11-25 19:12:19.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:19 compute-0 nova_compute[187212]: 2025-11-25 19:12:19.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:19 compute-0 nova_compute[187212]: 2025-11-25 19:12:19.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:19 compute-0 nova_compute[187212]: 2025-11-25 19:12:19.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:19 compute-0 nova_compute[187212]: 2025-11-25 19:12:19.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:19 compute-0 nova_compute[187212]: 2025-11-25 19:12:19.686 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:12:20 compute-0 nova_compute[187212]: 2025-11-25 19:12:20.736 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:20 compute-0 nova_compute[187212]: 2025-11-25 19:12:20.822 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:20 compute-0 nova_compute[187212]: 2025-11-25 19:12:20.823 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:20 compute-0 nova_compute[187212]: 2025-11-25 19:12:20.885 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.075 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.099 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.100 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.140 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.141 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=72.99633407592773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.141 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.142 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:21 compute-0 nova_compute[187212]: 2025-11-25 19:12:21.677 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:22 compute-0 nova_compute[187212]: 2025-11-25 19:12:22.393 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 909b423a-9e57-4bb8-b6b5-719b05724d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:12:22 compute-0 nova_compute[187212]: 2025-11-25 19:12:22.393 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:12:22 compute-0 nova_compute[187212]: 2025-11-25 19:12:22.394 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:12:21 up  1:04,  0 user,  load average: 0.63, 0.39, 0.46\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_780511b4bf4d49299cc4d9b324261841': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:12:22 compute-0 nova_compute[187212]: 2025-11-25 19:12:22.442 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:12:23 compute-0 nova_compute[187212]: 2025-11-25 19:12:23.333 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:12:23 compute-0 nova_compute[187212]: 2025-11-25 19:12:23.846 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:12:23 compute-0 nova_compute[187212]: 2025-11-25 19:12:23.846 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.704s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:24 compute-0 ovn_controller[95465]: 2025-11-25T19:12:24Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:ba:48 10.100.0.5
Nov 25 19:12:24 compute-0 ovn_controller[95465]: 2025-11-25T19:12:24Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:ba:48 10.100.0.5
Nov 25 19:12:25 compute-0 nova_compute[187212]: 2025-11-25 19:12:25.846 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:25 compute-0 nova_compute[187212]: 2025-11-25 19:12:25.847 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:12:26 compute-0 nova_compute[187212]: 2025-11-25 19:12:26.078 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:26 compute-0 podman[211031]: 2025-11-25 19:12:26.233935076 +0000 UTC m=+0.145474755 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 19:12:26 compute-0 nova_compute[187212]: 2025-11-25 19:12:26.717 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:27 compute-0 nova_compute[187212]: 2025-11-25 19:12:27.071 187216 DEBUG nova.compute.manager [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Nov 25 19:12:27 compute-0 nova_compute[187212]: 2025-11-25 19:12:27.623 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:27 compute-0 nova_compute[187212]: 2025-11-25 19:12:27.624 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:28 compute-0 nova_compute[187212]: 2025-11-25 19:12:28.147 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:12:28 compute-0 nova_compute[187212]: 2025-11-25 19:12:28.149 187216 INFO nova.compute.claims [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:12:28 compute-0 nova_compute[187212]: 2025-11-25 19:12:28.666 187216 INFO nova.compute.resource_tracker [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updating resource usage from migration 79d8a4e8-5c62-40bb-91c8-6e3dd1091c5a
Nov 25 19:12:28 compute-0 nova_compute[187212]: 2025-11-25 19:12:28.667 187216 DEBUG nova.compute.resource_tracker [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Starting to track incoming migration 79d8a4e8-5c62-40bb-91c8-6e3dd1091c5a with flavor 8101b76b-3714-46ca-8790-3538f1950a51 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Nov 25 19:12:29 compute-0 podman[211059]: 2025-11-25 19:12:29.161457938 +0000 UTC m=+0.085692633 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:12:29 compute-0 nova_compute[187212]: 2025-11-25 19:12:29.231 187216 DEBUG nova.compute.provider_tree [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:12:29 compute-0 nova_compute[187212]: 2025-11-25 19:12:29.740 187216 DEBUG nova.scheduler.client.report [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:12:29 compute-0 podman[197585]: time="2025-11-25T19:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:12:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:12:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3045 "" "Go-http-client/1.1"
Nov 25 19:12:30 compute-0 nova_compute[187212]: 2025-11-25 19:12:30.258 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.634s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:30 compute-0 nova_compute[187212]: 2025-11-25 19:12:30.259 187216 INFO nova.compute.manager [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Migrating
Nov 25 19:12:30 compute-0 nova_compute[187212]: 2025-11-25 19:12:30.259 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:12:30 compute-0 nova_compute[187212]: 2025-11-25 19:12:30.259 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:12:30 compute-0 nova_compute[187212]: 2025-11-25 19:12:30.795 187216 INFO nova.compute.rpcapi [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Nov 25 19:12:30 compute-0 nova_compute[187212]: 2025-11-25 19:12:30.796 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:12:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:31.077 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:31.078 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:31.079 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:31 compute-0 nova_compute[187212]: 2025-11-25 19:12:31.081 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:31 compute-0 openstack_network_exporter[199731]: ERROR   19:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:12:31 compute-0 openstack_network_exporter[199731]: ERROR   19:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:12:31 compute-0 openstack_network_exporter[199731]: ERROR   19:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:12:31 compute-0 openstack_network_exporter[199731]: ERROR   19:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:12:31 compute-0 openstack_network_exporter[199731]: ERROR   19:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:12:31 compute-0 nova_compute[187212]: 2025-11-25 19:12:31.761 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:34 compute-0 sshd-session[211080]: Accepted publickey for nova from 192.168.122.101 port 51058 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:12:34 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 19:12:34 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 19:12:34 compute-0 systemd-logind[820]: New session 28 of user nova.
Nov 25 19:12:34 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 19:12:34 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 25 19:12:34 compute-0 systemd[211102]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:12:34 compute-0 podman[211082]: 2025-11-25 19:12:34.880982594 +0000 UTC m=+0.115717573 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Nov 25 19:12:35 compute-0 systemd[211102]: Queued start job for default target Main User Target.
Nov 25 19:12:35 compute-0 systemd[211102]: Created slice User Application Slice.
Nov 25 19:12:35 compute-0 systemd[211102]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 19:12:35 compute-0 systemd[211102]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 19:12:35 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 25 19:12:35 compute-0 systemd[211102]: Reached target Paths.
Nov 25 19:12:35 compute-0 systemd[211102]: Reached target Timers.
Nov 25 19:12:35 compute-0 systemd[211102]: Starting D-Bus User Message Bus Socket...
Nov 25 19:12:35 compute-0 systemd[211102]: Starting Create User's Volatile Files and Directories...
Nov 25 19:12:35 compute-0 systemd[211102]: Listening on D-Bus User Message Bus Socket.
Nov 25 19:12:35 compute-0 systemd[211102]: Reached target Sockets.
Nov 25 19:12:35 compute-0 systemd[211102]: Finished Create User's Volatile Files and Directories.
Nov 25 19:12:35 compute-0 systemd[211102]: Reached target Basic System.
Nov 25 19:12:35 compute-0 systemd[211102]: Reached target Main User Target.
Nov 25 19:12:35 compute-0 systemd[211102]: Startup finished in 202ms.
Nov 25 19:12:35 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 25 19:12:35 compute-0 systemd[1]: Started Session 28 of User nova.
Nov 25 19:12:35 compute-0 sshd-session[211080]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:12:35 compute-0 sshd-session[211122]: Received disconnect from 192.168.122.101 port 51058:11: disconnected by user
Nov 25 19:12:35 compute-0 sshd-session[211122]: Disconnected from user nova 192.168.122.101 port 51058
Nov 25 19:12:35 compute-0 sshd-session[211080]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:12:35 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 25 19:12:35 compute-0 systemd-logind[820]: Session 28 logged out. Waiting for processes to exit.
Nov 25 19:12:35 compute-0 systemd-logind[820]: Removed session 28.
Nov 25 19:12:35 compute-0 sshd-session[211124]: Accepted publickey for nova from 192.168.122.101 port 51074 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:12:35 compute-0 systemd-logind[820]: New session 30 of user nova.
Nov 25 19:12:35 compute-0 systemd[1]: Started Session 30 of User nova.
Nov 25 19:12:35 compute-0 sshd-session[211124]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:12:35 compute-0 sshd-session[211127]: Received disconnect from 192.168.122.101 port 51074:11: disconnected by user
Nov 25 19:12:35 compute-0 sshd-session[211127]: Disconnected from user nova 192.168.122.101 port 51074
Nov 25 19:12:35 compute-0 sshd-session[211124]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:12:35 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Nov 25 19:12:35 compute-0 systemd-logind[820]: Session 30 logged out. Waiting for processes to exit.
Nov 25 19:12:35 compute-0 systemd-logind[820]: Removed session 30.
Nov 25 19:12:36 compute-0 nova_compute[187212]: 2025-11-25 19:12:36.083 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:36 compute-0 nova_compute[187212]: 2025-11-25 19:12:36.764 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:38 compute-0 nova_compute[187212]: 2025-11-25 19:12:38.214 187216 DEBUG nova.compute.manager [req-b92488b0-b14b-4c57-ada9-45627c7d7b00 req-8294b3a9-17d6-4ca2-8055-6dc281b2f388 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:38 compute-0 nova_compute[187212]: 2025-11-25 19:12:38.215 187216 DEBUG oslo_concurrency.lockutils [req-b92488b0-b14b-4c57-ada9-45627c7d7b00 req-8294b3a9-17d6-4ca2-8055-6dc281b2f388 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:38 compute-0 nova_compute[187212]: 2025-11-25 19:12:38.216 187216 DEBUG oslo_concurrency.lockutils [req-b92488b0-b14b-4c57-ada9-45627c7d7b00 req-8294b3a9-17d6-4ca2-8055-6dc281b2f388 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:38 compute-0 nova_compute[187212]: 2025-11-25 19:12:38.216 187216 DEBUG oslo_concurrency.lockutils [req-b92488b0-b14b-4c57-ada9-45627c7d7b00 req-8294b3a9-17d6-4ca2-8055-6dc281b2f388 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:38 compute-0 nova_compute[187212]: 2025-11-25 19:12:38.217 187216 DEBUG nova.compute.manager [req-b92488b0-b14b-4c57-ada9-45627c7d7b00 req-8294b3a9-17d6-4ca2-8055-6dc281b2f388 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] No waiting events found dispatching network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:12:38 compute-0 nova_compute[187212]: 2025-11-25 19:12:38.217 187216 WARNING nova.compute.manager [req-b92488b0-b14b-4c57-ada9-45627c7d7b00 req-8294b3a9-17d6-4ca2-8055-6dc281b2f388 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received unexpected event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 for instance with vm_state active and task_state resize_migrating.
Nov 25 19:12:38 compute-0 sshd-session[211129]: Accepted publickey for nova from 192.168.122.101 port 51080 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:12:38 compute-0 systemd-logind[820]: New session 31 of user nova.
Nov 25 19:12:38 compute-0 systemd[1]: Started Session 31 of User nova.
Nov 25 19:12:38 compute-0 sshd-session[211129]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:12:38 compute-0 podman[211131]: 2025-11-25 19:12:38.983237183 +0000 UTC m=+0.117453369 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Nov 25 19:12:39 compute-0 sshd-session[211142]: Received disconnect from 192.168.122.101 port 51080:11: disconnected by user
Nov 25 19:12:39 compute-0 sshd-session[211142]: Disconnected from user nova 192.168.122.101 port 51080
Nov 25 19:12:39 compute-0 sshd-session[211129]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:12:39 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Nov 25 19:12:39 compute-0 systemd-logind[820]: Session 31 logged out. Waiting for processes to exit.
Nov 25 19:12:39 compute-0 systemd-logind[820]: Removed session 31.
Nov 25 19:12:39 compute-0 sshd-session[211156]: Accepted publickey for nova from 192.168.122.101 port 51086 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:12:39 compute-0 systemd-logind[820]: New session 32 of user nova.
Nov 25 19:12:39 compute-0 systemd[1]: Started Session 32 of User nova.
Nov 25 19:12:39 compute-0 sshd-session[211156]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:12:39 compute-0 sshd-session[211159]: Received disconnect from 192.168.122.101 port 51086:11: disconnected by user
Nov 25 19:12:39 compute-0 sshd-session[211159]: Disconnected from user nova 192.168.122.101 port 51086
Nov 25 19:12:39 compute-0 sshd-session[211156]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:12:39 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Nov 25 19:12:39 compute-0 systemd-logind[820]: Session 32 logged out. Waiting for processes to exit.
Nov 25 19:12:39 compute-0 systemd-logind[820]: Removed session 32.
Nov 25 19:12:39 compute-0 sshd-session[211161]: Accepted publickey for nova from 192.168.122.101 port 51100 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:12:39 compute-0 systemd-logind[820]: New session 33 of user nova.
Nov 25 19:12:39 compute-0 systemd[1]: Started Session 33 of User nova.
Nov 25 19:12:39 compute-0 sshd-session[211161]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:12:40 compute-0 sshd-session[211164]: Received disconnect from 192.168.122.101 port 51100:11: disconnected by user
Nov 25 19:12:40 compute-0 sshd-session[211164]: Disconnected from user nova 192.168.122.101 port 51100
Nov 25 19:12:40 compute-0 sshd-session[211161]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:12:40 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Nov 25 19:12:40 compute-0 systemd-logind[820]: Session 33 logged out. Waiting for processes to exit.
Nov 25 19:12:40 compute-0 systemd-logind[820]: Removed session 33.
Nov 25 19:12:40 compute-0 nova_compute[187212]: 2025-11-25 19:12:40.303 187216 DEBUG nova.compute.manager [req-8d3d6f8f-6c63-43b9-8f85-482d03b95b84 req-b16aca39-4a8b-4696-8a4e-7f1ed0395b7f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:40 compute-0 nova_compute[187212]: 2025-11-25 19:12:40.304 187216 DEBUG oslo_concurrency.lockutils [req-8d3d6f8f-6c63-43b9-8f85-482d03b95b84 req-b16aca39-4a8b-4696-8a4e-7f1ed0395b7f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:40 compute-0 nova_compute[187212]: 2025-11-25 19:12:40.304 187216 DEBUG oslo_concurrency.lockutils [req-8d3d6f8f-6c63-43b9-8f85-482d03b95b84 req-b16aca39-4a8b-4696-8a4e-7f1ed0395b7f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:40 compute-0 nova_compute[187212]: 2025-11-25 19:12:40.304 187216 DEBUG oslo_concurrency.lockutils [req-8d3d6f8f-6c63-43b9-8f85-482d03b95b84 req-b16aca39-4a8b-4696-8a4e-7f1ed0395b7f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:40 compute-0 nova_compute[187212]: 2025-11-25 19:12:40.305 187216 DEBUG nova.compute.manager [req-8d3d6f8f-6c63-43b9-8f85-482d03b95b84 req-b16aca39-4a8b-4696-8a4e-7f1ed0395b7f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] No waiting events found dispatching network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:12:40 compute-0 nova_compute[187212]: 2025-11-25 19:12:40.305 187216 WARNING nova.compute.manager [req-8d3d6f8f-6c63-43b9-8f85-482d03b95b84 req-b16aca39-4a8b-4696-8a4e-7f1ed0395b7f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received unexpected event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 for instance with vm_state active and task_state resize_migrating.
Nov 25 19:12:41 compute-0 nova_compute[187212]: 2025-11-25 19:12:41.086 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:41 compute-0 nova_compute[187212]: 2025-11-25 19:12:41.767 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:42 compute-0 nova_compute[187212]: 2025-11-25 19:12:42.799 187216 WARNING neutronclient.v2_0.client [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.002 187216 INFO nova.network.neutron [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updating port b74c368f-baf3-47d1-9cfb-df249446cbb3 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 25 19:12:43 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:43.293 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.294 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:43 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:43.295 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.492 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-dd2a5303-3518-4f79-aa7b-45fc96059d01" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.493 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-dd2a5303-3518-4f79-aa7b-45fc96059d01" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.494 187216 DEBUG nova.network.neutron [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.579 187216 DEBUG nova.compute.manager [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-changed-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.579 187216 DEBUG nova.compute.manager [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Refreshing instance network info cache due to event network-changed-b74c368f-baf3-47d1-9cfb-df249446cbb3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:12:43 compute-0 nova_compute[187212]: 2025-11-25 19:12:43.580 187216 DEBUG oslo_concurrency.lockutils [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-dd2a5303-3518-4f79-aa7b-45fc96059d01" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:12:44 compute-0 nova_compute[187212]: 2025-11-25 19:12:44.001 187216 WARNING neutronclient.v2_0.client [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:44 compute-0 nova_compute[187212]: 2025-11-25 19:12:44.803 187216 WARNING neutronclient.v2_0.client [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:45 compute-0 nova_compute[187212]: 2025-11-25 19:12:45.067 187216 DEBUG nova.network.neutron [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updating instance_info_cache with network_info: [{"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:12:45 compute-0 nova_compute[187212]: 2025-11-25 19:12:45.575 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-dd2a5303-3518-4f79-aa7b-45fc96059d01" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:12:45 compute-0 nova_compute[187212]: 2025-11-25 19:12:45.581 187216 DEBUG oslo_concurrency.lockutils [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-dd2a5303-3518-4f79-aa7b-45fc96059d01" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:12:45 compute-0 nova_compute[187212]: 2025-11-25 19:12:45.582 187216 DEBUG nova.network.neutron [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Refreshing network info cache for port b74c368f-baf3-47d1-9cfb-df249446cbb3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.088 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.140 187216 WARNING neutronclient.v2_0.client [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.168 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.171 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.171 187216 INFO nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Creating image(s)
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.173 187216 DEBUG oslo_concurrency.processutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.264 187216 DEBUG oslo_concurrency.processutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.266 187216 DEBUG nova.virt.disk.api [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Checking if we can resize image /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.266 187216 DEBUG oslo_concurrency.processutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.334 187216 DEBUG oslo_concurrency.processutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.335 187216 DEBUG nova.virt.disk.api [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Cannot resize image /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.594 187216 WARNING neutronclient.v2_0.client [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.769 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.846 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.847 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Ensure instance console log exists: /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.848 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.849 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.849 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.854 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Start _get_guest_xml network_info=[{"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:d5:ad:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.863 187216 WARNING nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.865 187216 DEBUG nova.virt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-122760017', uuid='dd2a5303-3518-4f79-aa7b-45fc96059d01'), owner=OwnerMeta(userid='7c561073d7c34a029574a6e2fb952944', username='tempest-TestExecuteActionsViaActuator-1103022868-project-admin', projectid='780511b4bf4d49299cc4d9b324261841', projectname='tempest-TestExecuteActionsViaActuator-1103022868'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.micro', flavorid='8101b76b-3714-46ca-8790-3538f1950a51', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:d5:ad:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764097966.86553) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.875 187216 DEBUG nova.virt.libvirt.host [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.876 187216 DEBUG nova.virt.libvirt.host [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.881 187216 DEBUG nova.virt.libvirt.host [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.881 187216 DEBUG nova.virt.libvirt.host [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.883 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.884 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8101b76b-3714-46ca-8790-3538f1950a51',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.884 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.885 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.885 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.885 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.885 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.886 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.886 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.886 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.887 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.887 187216 DEBUG nova.virt.hardware [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.892 187216 DEBUG oslo_concurrency.processutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.979 187216 DEBUG oslo_concurrency.processutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk.config --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.981 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "/var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.981 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "/var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.984 187216 DEBUG oslo_concurrency.lockutils [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "/var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.987 187216 DEBUG nova.virt.libvirt.vif [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:11:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-122760017',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-122760017',id=4,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:11:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-6sh6lchw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:12:40Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=dd2a5303-3518-4f79-aa7b-45fc96059d01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:d5:ad:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.987 187216 DEBUG nova.network.os_vif_util [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:d5:ad:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.989 187216 DEBUG nova.network.os_vif_util [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.994 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <uuid>dd2a5303-3518-4f79-aa7b-45fc96059d01</uuid>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <name>instance-00000004</name>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <memory>196608</memory>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-122760017</nova:name>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:12:46</nova:creationTime>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:flavor name="m1.micro" id="8101b76b-3714-46ca-8790-3538f1950a51">
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:memory>192</nova:memory>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_input_bus">usb</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_machine_type">q35</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_video_model">virtio</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:property name="hw_vif_model">virtio</nova:property>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:user uuid="7c561073d7c34a029574a6e2fb952944">tempest-TestExecuteActionsViaActuator-1103022868-project-admin</nova:user>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:project uuid="780511b4bf4d49299cc4d9b324261841">tempest-TestExecuteActionsViaActuator-1103022868</nova:project>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         <nova:port uuid="b74c368f-baf3-47d1-9cfb-df249446cbb3">
Nov 25 19:12:46 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <system>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <entry name="serial">dd2a5303-3518-4f79-aa7b-45fc96059d01</entry>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <entry name="uuid">dd2a5303-3518-4f79-aa7b-45fc96059d01</entry>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </system>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <os>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </os>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <features>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </features>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk.config"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:d5:ad:b9"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <target dev="tapb74c368f-ba"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/console.log" append="off"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <video>
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </video>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:12:46 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:12:46 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:12:46 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:12:46 compute-0 nova_compute[187212]: </domain>
Nov 25 19:12:46 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.997 187216 DEBUG nova.virt.libvirt.vif [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:11:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-122760017',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-122760017',id=4,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:11:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-6sh6lchw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:12:40Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=dd2a5303-3518-4f79-aa7b-45fc96059d01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:d5:ad:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:12:46 compute-0 nova_compute[187212]: 2025-11-25 19:12:46.998 187216 DEBUG nova.network.os_vif_util [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:d5:ad:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.000 187216 DEBUG nova.network.os_vif_util [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.001 187216 DEBUG os_vif [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.002 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.003 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.004 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.006 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.006 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '83bdadba-7de2-5524-b5f0-021fb438577a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.009 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.014 187216 DEBUG nova.network.neutron [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updated VIF entry in instance network info cache for port b74c368f-baf3-47d1-9cfb-df249446cbb3. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.015 187216 DEBUG nova.network.neutron [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updating instance_info_cache with network_info: [{"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.019 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.020 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb74c368f-ba, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.020 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb74c368f-ba, col_values=(('qos', UUID('3f95acd4-5d09-461c-8dc8-a16d8cecacd0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.021 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb74c368f-ba, col_values=(('external_ids', {'iface-id': 'b74c368f-baf3-47d1-9cfb-df249446cbb3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:ad:b9', 'vm-uuid': 'dd2a5303-3518-4f79-aa7b-45fc96059d01'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.022 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:47 compute-0 NetworkManager[55552]: <info>  [1764097967.0244] manager: (tapb74c368f-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.025 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.032 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.033 187216 INFO os_vif [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba')
Nov 25 19:12:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:47.298 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:47 compute-0 nova_compute[187212]: 2025-11-25 19:12:47.528 187216 DEBUG oslo_concurrency.lockutils [req-e42886c6-8cc9-4130-8df0-044471c3901b req-fec3faab-3dd1-4382-b666-97f15890000a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-dd2a5303-3518-4f79-aa7b-45fc96059d01" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.589 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.589 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.590 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No VIF found with MAC fa:16:3e:d5:ad:b9, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.590 187216 INFO nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Using config drive
Nov 25 19:12:48 compute-0 kernel: tapb74c368f-ba: entered promiscuous mode
Nov 25 19:12:48 compute-0 NetworkManager[55552]: <info>  [1764097968.6655] manager: (tapb74c368f-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Nov 25 19:12:48 compute-0 ovn_controller[95465]: 2025-11-25T19:12:48Z|00054|binding|INFO|Claiming lport b74c368f-baf3-47d1-9cfb-df249446cbb3 for this chassis.
Nov 25 19:12:48 compute-0 ovn_controller[95465]: 2025-11-25T19:12:48Z|00055|binding|INFO|b74c368f-baf3-47d1-9cfb-df249446cbb3: Claiming fa:16:3e:d5:ad:b9 10.100.0.6
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.668 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.678 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:ad:b9 10.100.0.6'], port_security=['fa:16:3e:d5:ad:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'dd2a5303-3518-4f79-aa7b-45fc96059d01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=b74c368f-baf3-47d1-9cfb-df249446cbb3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.679 104356 INFO neutron.agent.ovn.metadata.agent [-] Port b74c368f-baf3-47d1-9cfb-df249446cbb3 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 bound to our chassis
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.682 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:12:48 compute-0 ovn_controller[95465]: 2025-11-25T19:12:48Z|00056|binding|INFO|Setting lport b74c368f-baf3-47d1-9cfb-df249446cbb3 ovn-installed in OVS
Nov 25 19:12:48 compute-0 ovn_controller[95465]: 2025-11-25T19:12:48Z|00057|binding|INFO|Setting lport b74c368f-baf3-47d1-9cfb-df249446cbb3 up in Southbound
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.699 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.711 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[59088f6b-0710-4b10-9a34-6c60aceed303]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:48 compute-0 systemd-udevd[211192]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:12:48 compute-0 systemd-machined[153494]: New machine qemu-3-instance-00000004.
Nov 25 19:12:48 compute-0 NetworkManager[55552]: <info>  [1764097968.7409] device (tapb74c368f-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:12:48 compute-0 NetworkManager[55552]: <info>  [1764097968.7429] device (tapb74c368f-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:12:48 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.765 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd510b4-62ea-4ba0-bc78-fa4da5a8fed5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.769 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[e43a5c10-d095-4a7a-92bd-2aa71b99eaef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.817 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[e339cd62-11b0-4dc8-a34a-59bd3a188803]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.843 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[159ed985-b19b-4f99-b933-96f8bc5ff314]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 44288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211206, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.869 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1a276434-9332-402f-a4f0-c7883d8bf383]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211208, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211208, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.870 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.906 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:48 compute-0 nova_compute[187212]: 2025-11-25 19:12:48.908 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.909 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.909 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.909 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.910 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:12:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:12:48.913 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[eff0a110-0db3-4d35-9659-f4c042691379]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:12:49 compute-0 nova_compute[187212]: 2025-11-25 19:12:49.233 187216 DEBUG nova.compute.manager [req-4ec9541a-491e-4bcf-bc34-273ecd93c10b req-f3fafee2-da39-4a8b-8b84-97688950640c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-plugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:49 compute-0 nova_compute[187212]: 2025-11-25 19:12:49.234 187216 DEBUG oslo_concurrency.lockutils [req-4ec9541a-491e-4bcf-bc34-273ecd93c10b req-f3fafee2-da39-4a8b-8b84-97688950640c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:49 compute-0 nova_compute[187212]: 2025-11-25 19:12:49.234 187216 DEBUG oslo_concurrency.lockutils [req-4ec9541a-491e-4bcf-bc34-273ecd93c10b req-f3fafee2-da39-4a8b-8b84-97688950640c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:49 compute-0 nova_compute[187212]: 2025-11-25 19:12:49.234 187216 DEBUG oslo_concurrency.lockutils [req-4ec9541a-491e-4bcf-bc34-273ecd93c10b req-f3fafee2-da39-4a8b-8b84-97688950640c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:49 compute-0 nova_compute[187212]: 2025-11-25 19:12:49.235 187216 DEBUG nova.compute.manager [req-4ec9541a-491e-4bcf-bc34-273ecd93c10b req-f3fafee2-da39-4a8b-8b84-97688950640c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] No waiting events found dispatching network-vif-plugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:12:49 compute-0 nova_compute[187212]: 2025-11-25 19:12:49.235 187216 WARNING nova.compute.manager [req-4ec9541a-491e-4bcf-bc34-273ecd93c10b req-f3fafee2-da39-4a8b-8b84-97688950640c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received unexpected event network-vif-plugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 for instance with vm_state active and task_state resize_finish.
Nov 25 19:12:50 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 19:12:50 compute-0 systemd[211102]: Activating special unit Exit the Session...
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped target Main User Target.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped target Basic System.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped target Paths.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped target Sockets.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped target Timers.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 19:12:50 compute-0 systemd[211102]: Closed D-Bus User Message Bus Socket.
Nov 25 19:12:50 compute-0 systemd[211102]: Stopped Create User's Volatile Files and Directories.
Nov 25 19:12:50 compute-0 systemd[211102]: Removed slice User Application Slice.
Nov 25 19:12:50 compute-0 systemd[211102]: Reached target Shutdown.
Nov 25 19:12:50 compute-0 systemd[211102]: Finished Exit the Session.
Nov 25 19:12:50 compute-0 systemd[211102]: Reached target Exit the Session.
Nov 25 19:12:50 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 19:12:50 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 19:12:50 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 19:12:50 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 19:12:50 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 19:12:50 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 19:12:50 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 19:12:50 compute-0 podman[211209]: 2025-11-25 19:12:50.166615086 +0000 UTC m=+0.086546316 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:12:50 compute-0 nova_compute[187212]: 2025-11-25 19:12:50.220 187216 DEBUG nova.compute.manager [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:12:50 compute-0 nova_compute[187212]: 2025-11-25 19:12:50.224 187216 INFO nova.virt.libvirt.driver [-] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Instance running successfully.
Nov 25 19:12:50 compute-0 virtqemud[186888]: argument unsupported: QEMU guest agent is not configured
Nov 25 19:12:50 compute-0 nova_compute[187212]: 2025-11-25 19:12:50.227 187216 DEBUG nova.virt.libvirt.guest [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Nov 25 19:12:50 compute-0 nova_compute[187212]: 2025-11-25 19:12:50.228 187216 DEBUG nova.virt.libvirt.driver [None req-b83ea631-1ac2-4ae4-a39b-81e35aed5420 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.316 187216 DEBUG nova.compute.manager [req-8ee535ed-9c20-4e49-aff7-efeadaeb9f44 req-fa2acbfa-c1b1-41e2-a536-a794a2bbb900 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-plugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.317 187216 DEBUG oslo_concurrency.lockutils [req-8ee535ed-9c20-4e49-aff7-efeadaeb9f44 req-fa2acbfa-c1b1-41e2-a536-a794a2bbb900 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.317 187216 DEBUG oslo_concurrency.lockutils [req-8ee535ed-9c20-4e49-aff7-efeadaeb9f44 req-fa2acbfa-c1b1-41e2-a536-a794a2bbb900 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.318 187216 DEBUG oslo_concurrency.lockutils [req-8ee535ed-9c20-4e49-aff7-efeadaeb9f44 req-fa2acbfa-c1b1-41e2-a536-a794a2bbb900 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.318 187216 DEBUG nova.compute.manager [req-8ee535ed-9c20-4e49-aff7-efeadaeb9f44 req-fa2acbfa-c1b1-41e2-a536-a794a2bbb900 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] No waiting events found dispatching network-vif-plugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.319 187216 WARNING nova.compute.manager [req-8ee535ed-9c20-4e49-aff7-efeadaeb9f44 req-fa2acbfa-c1b1-41e2-a536-a794a2bbb900 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received unexpected event network-vif-plugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 for instance with vm_state resized and task_state None.
Nov 25 19:12:51 compute-0 nova_compute[187212]: 2025-11-25 19:12:51.771 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:52 compute-0 nova_compute[187212]: 2025-11-25 19:12:52.023 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:56 compute-0 nova_compute[187212]: 2025-11-25 19:12:56.774 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:57 compute-0 nova_compute[187212]: 2025-11-25 19:12:57.028 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:12:57 compute-0 podman[211246]: 2025-11-25 19:12:57.214020723 +0000 UTC m=+0.128964480 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:12:59 compute-0 podman[197585]: time="2025-11-25T19:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:12:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:12:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 25 19:13:00 compute-0 podman[211272]: 2025-11-25 19:13:00.18814858 +0000 UTC m=+0.102460285 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: ERROR   19:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: ERROR   19:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: ERROR   19:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: ERROR   19:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: ERROR   19:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:13:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:13:01 compute-0 nova_compute[187212]: 2025-11-25 19:13:01.835 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:02 compute-0 nova_compute[187212]: 2025-11-25 19:13:02.031 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:02 compute-0 ovn_controller[95465]: 2025-11-25T19:13:02Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:ad:b9 10.100.0.6
Nov 25 19:13:05 compute-0 podman[211298]: 2025-11-25 19:13:05.178092095 +0000 UTC m=+0.099570347 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:13:06 compute-0 nova_compute[187212]: 2025-11-25 19:13:06.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:07 compute-0 nova_compute[187212]: 2025-11-25 19:13:07.034 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:09 compute-0 podman[211320]: 2025-11-25 19:13:09.184057695 +0000 UTC m=+0.102197408 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:13:11 compute-0 nova_compute[187212]: 2025-11-25 19:13:11.838 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:12 compute-0 nova_compute[187212]: 2025-11-25 19:13:12.036 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:14 compute-0 nova_compute[187212]: 2025-11-25 19:13:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:14 compute-0 nova_compute[187212]: 2025-11-25 19:13:14.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:13:16 compute-0 nova_compute[187212]: 2025-11-25 19:13:16.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:16 compute-0 nova_compute[187212]: 2025-11-25 19:13:16.841 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:17 compute-0 nova_compute[187212]: 2025-11-25 19:13:17.038 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:18 compute-0 nova_compute[187212]: 2025-11-25 19:13:18.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:18 compute-0 nova_compute[187212]: 2025-11-25 19:13:18.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:19 compute-0 nova_compute[187212]: 2025-11-25 19:13:19.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:19 compute-0 nova_compute[187212]: 2025-11-25 19:13:19.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:19 compute-0 nova_compute[187212]: 2025-11-25 19:13:19.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:19 compute-0 nova_compute[187212]: 2025-11-25 19:13:19.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:19 compute-0 nova_compute[187212]: 2025-11-25 19:13:19.690 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.755 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.848 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.849 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.916 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.924 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.981 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:20 compute-0 nova_compute[187212]: 2025-11-25 19:13:20.982 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.087 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:21 compute-0 podman[211352]: 2025-11-25 19:13:21.183332307 +0000 UTC m=+0.099262108 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.342 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.345 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.378 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.379 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5498MB free_disk=72.93967819213867GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.380 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.380 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.425 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.425 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.843 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:21 compute-0 nova_compute[187212]: 2025-11-25 19:13:21.933 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.041 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.445 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 909b423a-9e57-4bb8-b6b5-719b05724d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.445 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance dd2a5303-3518-4f79-aa7b-45fc96059d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.477 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.952 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 6fe2a300-76bb-44b4-8828-f87977451114 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.952 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:13:22 compute-0 nova_compute[187212]: 2025-11-25 19:13:22.952 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:13:21 up  1:05,  0 user,  load average: 0.58, 0.42, 0.47\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_780511b4bf4d49299cc4d9b324261841': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:13:23 compute-0 nova_compute[187212]: 2025-11-25 19:13:23.066 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:13:23 compute-0 nova_compute[187212]: 2025-11-25 19:13:23.578 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:13:24 compute-0 nova_compute[187212]: 2025-11-25 19:13:24.118 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:13:24 compute-0 nova_compute[187212]: 2025-11-25 19:13:24.119 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.739s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:24 compute-0 nova_compute[187212]: 2025-11-25 19:13:24.119 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.642s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:24 compute-0 nova_compute[187212]: 2025-11-25 19:13:24.127 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:13:24 compute-0 nova_compute[187212]: 2025-11-25 19:13:24.127 187216 INFO nova.compute.claims [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:13:25 compute-0 nova_compute[187212]: 2025-11-25 19:13:25.121 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:25 compute-0 nova_compute[187212]: 2025-11-25 19:13:25.124 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:25 compute-0 nova_compute[187212]: 2025-11-25 19:13:25.125 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:25 compute-0 nova_compute[187212]: 2025-11-25 19:13:25.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:13:25 compute-0 nova_compute[187212]: 2025-11-25 19:13:25.228 187216 DEBUG nova.compute.provider_tree [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:13:25 compute-0 nova_compute[187212]: 2025-11-25 19:13:25.736 187216 DEBUG nova.scheduler.client.report [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.247 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.248 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.765 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.766 187216 DEBUG nova.network.neutron [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.767 187216 WARNING neutronclient.v2_0.client [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.768 187216 WARNING neutronclient.v2_0.client [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:13:26 compute-0 nova_compute[187212]: 2025-11-25 19:13:26.845 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:27 compute-0 nova_compute[187212]: 2025-11-25 19:13:27.043 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:27 compute-0 nova_compute[187212]: 2025-11-25 19:13:27.276 187216 INFO nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:13:27 compute-0 nova_compute[187212]: 2025-11-25 19:13:27.375 187216 DEBUG nova.network.neutron [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Successfully created port: 5cceedef-39bc-43df-be34-b65a3f0dd6b1 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:13:27 compute-0 nova_compute[187212]: 2025-11-25 19:13:27.786 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.072 187216 DEBUG nova.network.neutron [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Successfully updated port: 5cceedef-39bc-43df-be34-b65a3f0dd6b1 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.153 187216 DEBUG nova.compute.manager [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-changed-5cceedef-39bc-43df-be34-b65a3f0dd6b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.154 187216 DEBUG nova.compute.manager [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Refreshing instance network info cache due to event network-changed-5cceedef-39bc-43df-be34-b65a3f0dd6b1. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.154 187216 DEBUG oslo_concurrency.lockutils [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-6fe2a300-76bb-44b4-8828-f87977451114" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.154 187216 DEBUG oslo_concurrency.lockutils [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-6fe2a300-76bb-44b4-8828-f87977451114" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.155 187216 DEBUG nova.network.neutron [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Refreshing network info cache for port 5cceedef-39bc-43df-be34-b65a3f0dd6b1 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:13:28 compute-0 podman[211380]: 2025-11-25 19:13:28.224860504 +0000 UTC m=+0.135155887 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.579 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "refresh_cache-6fe2a300-76bb-44b4-8828-f87977451114" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.662 187216 WARNING neutronclient.v2_0.client [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.806 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.808 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.809 187216 INFO nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Creating image(s)
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.810 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "/var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.811 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "/var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.812 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "/var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.813 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.820 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.823 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.910 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.911 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.913 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.913 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.920 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.921 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.994 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:28 compute-0 nova_compute[187212]: 2025-11-25 19:13:28.995 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.042 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.043 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.044 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.062 187216 DEBUG nova.network.neutron [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.131 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.132 187216 DEBUG nova.virt.disk.api [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Checking if we can resize image /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.133 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.194 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.195 187216 DEBUG nova.virt.disk.api [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Cannot resize image /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.195 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.196 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Ensure instance console log exists: /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.196 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.197 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.197 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.240 187216 DEBUG nova.network.neutron [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:13:29 compute-0 podman[197585]: time="2025-11-25T19:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.746 187216 DEBUG oslo_concurrency.lockutils [req-a70fa602-7297-462f-bc83-90bb2a1343f4 req-c07ff0df-2a46-4ead-8e81-d2f83e1945f1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-6fe2a300-76bb-44b4-8828-f87977451114" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.747 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquired lock "refresh_cache-6fe2a300-76bb-44b4-8828-f87977451114" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:13:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:13:29 compute-0 nova_compute[187212]: 2025-11-25 19:13:29.748 187216 DEBUG nova.network.neutron [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:13:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 25 19:13:30 compute-0 nova_compute[187212]: 2025-11-25 19:13:30.483 187216 DEBUG nova.network.neutron [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:13:30 compute-0 nova_compute[187212]: 2025-11-25 19:13:30.764 187216 WARNING neutronclient.v2_0.client [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:13:30 compute-0 nova_compute[187212]: 2025-11-25 19:13:30.927 187216 DEBUG nova.network.neutron [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Updating instance_info_cache with network_info: [{"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:13:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:31.080 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:31.081 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:31.082 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:31 compute-0 podman[211422]: 2025-11-25 19:13:31.165596401 +0000 UTC m=+0.083031541 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:13:31 compute-0 openstack_network_exporter[199731]: ERROR   19:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:13:31 compute-0 openstack_network_exporter[199731]: ERROR   19:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:13:31 compute-0 openstack_network_exporter[199731]: ERROR   19:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:13:31 compute-0 openstack_network_exporter[199731]: ERROR   19:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:13:31 compute-0 openstack_network_exporter[199731]: ERROR   19:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.436 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Releasing lock "refresh_cache-6fe2a300-76bb-44b4-8828-f87977451114" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.438 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Instance network_info: |[{"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.444 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Start _get_guest_xml network_info=[{"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.449 187216 WARNING nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.451 187216 DEBUG nova.virt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1631407178', uuid='6fe2a300-76bb-44b4-8828-f87977451114'), owner=OwnerMeta(userid='7c561073d7c34a029574a6e2fb952944', username='tempest-TestExecuteActionsViaActuator-1103022868-project-admin', projectid='780511b4bf4d49299cc4d9b324261841', projectname='tempest-TestExecuteActionsViaActuator-1103022868'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": 
"5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098011.4511633) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.458 187216 DEBUG nova.virt.libvirt.host [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.458 187216 DEBUG nova.virt.libvirt.host [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.462 187216 DEBUG nova.virt.libvirt.host [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.463 187216 DEBUG nova.virt.libvirt.host [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.465 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.465 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.466 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.466 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.467 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.467 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.467 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.468 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.468 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.468 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.469 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.469 187216 DEBUG nova.virt.hardware [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.474 187216 DEBUG nova.virt.libvirt.vif [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:13:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1631407178',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1631407178',id=7,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-8kkhqkbd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteAction
sViaActuator-1103022868-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:13:27Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=6fe2a300-76bb-44b4-8828-f87977451114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.475 187216 DEBUG nova.network.os_vif_util [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.476 187216 DEBUG nova.network.os_vif_util [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.477 187216 DEBUG nova.objects.instance [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6fe2a300-76bb-44b4-8828-f87977451114 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.847 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.986 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <uuid>6fe2a300-76bb-44b4-8828-f87977451114</uuid>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <name>instance-00000007</name>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1631407178</nova:name>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:13:31</nova:creationTime>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:13:31 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:13:31 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:user uuid="7c561073d7c34a029574a6e2fb952944">tempest-TestExecuteActionsViaActuator-1103022868-project-admin</nova:user>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:project uuid="780511b4bf4d49299cc4d9b324261841">tempest-TestExecuteActionsViaActuator-1103022868</nova:project>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         <nova:port uuid="5cceedef-39bc-43df-be34-b65a3f0dd6b1">
Nov 25 19:13:31 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <system>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <entry name="serial">6fe2a300-76bb-44b4-8828-f87977451114</entry>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <entry name="uuid">6fe2a300-76bb-44b4-8828-f87977451114</entry>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </system>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <os>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </os>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <features>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </features>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.config"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:cb:b8:85"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <target dev="tap5cceedef-39"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/console.log" append="off"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <video>
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </video>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:13:31 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:13:31 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:13:31 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:13:31 compute-0 nova_compute[187212]: </domain>
Nov 25 19:13:31 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.988 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Preparing to wait for external event network-vif-plugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.988 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.989 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.990 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.991 187216 DEBUG nova.virt.libvirt.vif [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:13:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1631407178',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1631407178',id=7,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-8kkhqkbd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:13:27Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=6fe2a300-76bb-44b4-8828-f87977451114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.991 187216 DEBUG nova.network.os_vif_util [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.992 187216 DEBUG nova.network.os_vif_util [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.993 187216 DEBUG os_vif [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.994 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.994 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.995 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.996 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.997 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd13d4fb5-e522-5338-bcd6-90546291b9ca', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:31 compute-0 nova_compute[187212]: 2025-11-25 19:13:31.998 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.000 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.005 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.005 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5cceedef-39, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.006 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5cceedef-39, col_values=(('qos', UUID('43723900-9c38-4267-91ea-0251fc489e32')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.006 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5cceedef-39, col_values=(('external_ids', {'iface-id': '5cceedef-39bc-43df-be34-b65a3f0dd6b1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:b8:85', 'vm-uuid': '6fe2a300-76bb-44b4-8828-f87977451114'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.008 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:32 compute-0 NetworkManager[55552]: <info>  [1764098012.0098] manager: (tap5cceedef-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.011 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.019 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:32 compute-0 nova_compute[187212]: 2025-11-25 19:13:32.020 187216 INFO os_vif [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39')
Nov 25 19:13:33 compute-0 nova_compute[187212]: 2025-11-25 19:13:33.569 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:13:33 compute-0 nova_compute[187212]: 2025-11-25 19:13:33.570 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:13:33 compute-0 nova_compute[187212]: 2025-11-25 19:13:33.571 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No VIF found with MAC fa:16:3e:cb:b8:85, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:13:33 compute-0 nova_compute[187212]: 2025-11-25 19:13:33.572 187216 INFO nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Using config drive
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.084 187216 WARNING neutronclient.v2_0.client [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.236 187216 INFO nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Creating config drive at /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.config
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.247 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp1b59t5ao execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.392 187216 DEBUG oslo_concurrency.processutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp1b59t5ao" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:13:34 compute-0 kernel: tap5cceedef-39: entered promiscuous mode
Nov 25 19:13:34 compute-0 NetworkManager[55552]: <info>  [1764098014.4853] manager: (tap5cceedef-39): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.487 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:34 compute-0 ovn_controller[95465]: 2025-11-25T19:13:34Z|00058|binding|INFO|Claiming lport 5cceedef-39bc-43df-be34-b65a3f0dd6b1 for this chassis.
Nov 25 19:13:34 compute-0 ovn_controller[95465]: 2025-11-25T19:13:34Z|00059|binding|INFO|5cceedef-39bc-43df-be34-b65a3f0dd6b1: Claiming fa:16:3e:cb:b8:85 10.100.0.13
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.499 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:b8:85 10.100.0.13'], port_security=['fa:16:3e:cb:b8:85 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6fe2a300-76bb-44b4-8828-f87977451114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=5cceedef-39bc-43df-be34-b65a3f0dd6b1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.500 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 5cceedef-39bc-43df-be34-b65a3f0dd6b1 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 bound to our chassis
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.503 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:13:34 compute-0 ovn_controller[95465]: 2025-11-25T19:13:34Z|00060|binding|INFO|Setting lport 5cceedef-39bc-43df-be34-b65a3f0dd6b1 ovn-installed in OVS
Nov 25 19:13:34 compute-0 ovn_controller[95465]: 2025-11-25T19:13:34Z|00061|binding|INFO|Setting lport 5cceedef-39bc-43df-be34-b65a3f0dd6b1 up in Southbound
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.520 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.523 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.526 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1c37f36c-6aa1-4dc8-95f2-1763e1bf307a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 systemd-udevd[211465]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:13:34 compute-0 systemd-machined[153494]: New machine qemu-4-instance-00000007.
Nov 25 19:13:34 compute-0 NetworkManager[55552]: <info>  [1764098014.5666] device (tap5cceedef-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:13:34 compute-0 NetworkManager[55552]: <info>  [1764098014.5688] device (tap5cceedef-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:13:34 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.573 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[e2048026-5307-4a48-8a3d-a06fe057ceb3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.579 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea196b5-716f-421a-beea-6bb208ce8ccc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.627 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[afe786bc-4138-4748-b00e-f889088c89c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.651 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[287bcd76-e5f3-434d-a573-db8d234b6d5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 44288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211476, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.676 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[76b9e6d9-d14d-4f83-954f-607ede781825]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211477, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211477, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.677 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.679 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.680 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.681 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.682 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.682 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.683 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:13:34 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:34.685 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[267f8c88-2d0a-42c5-90e5-5c26c192ae4f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.700 187216 DEBUG nova.compute.manager [req-6eb00d3d-9924-4fd4-861c-0424e7f1ec1e req-01138921-41a6-4b5d-b980-7ff55f723583 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-plugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.700 187216 DEBUG oslo_concurrency.lockutils [req-6eb00d3d-9924-4fd4-861c-0424e7f1ec1e req-01138921-41a6-4b5d-b980-7ff55f723583 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.701 187216 DEBUG oslo_concurrency.lockutils [req-6eb00d3d-9924-4fd4-861c-0424e7f1ec1e req-01138921-41a6-4b5d-b980-7ff55f723583 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.701 187216 DEBUG oslo_concurrency.lockutils [req-6eb00d3d-9924-4fd4-861c-0424e7f1ec1e req-01138921-41a6-4b5d-b980-7ff55f723583 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:34 compute-0 nova_compute[187212]: 2025-11-25 19:13:34.702 187216 DEBUG nova.compute.manager [req-6eb00d3d-9924-4fd4-861c-0424e7f1ec1e req-01138921-41a6-4b5d-b980-7ff55f723583 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Processing event network-vif-plugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.357 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.362 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.367 187216 INFO nova.virt.libvirt.driver [-] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Instance spawned successfully.
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.368 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.884 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.885 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.886 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.886 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.887 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:13:35 compute-0 nova_compute[187212]: 2025-11-25 19:13:35.887 187216 DEBUG nova.virt.libvirt.driver [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:13:36 compute-0 podman[211486]: 2025-11-25 19:13:36.169842297 +0000 UTC m=+0.087935347 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, version=9.6)
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.398 187216 INFO nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Took 7.59 seconds to spawn the instance on the hypervisor.
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.399 187216 DEBUG nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.790 187216 DEBUG nova.compute.manager [req-98a515cb-9eca-43d3-bd5a-35897756354d req-b1ad2ceb-cbef-4ea7-b6a7-43cf49eb286c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-plugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.791 187216 DEBUG oslo_concurrency.lockutils [req-98a515cb-9eca-43d3-bd5a-35897756354d req-b1ad2ceb-cbef-4ea7-b6a7-43cf49eb286c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.792 187216 DEBUG oslo_concurrency.lockutils [req-98a515cb-9eca-43d3-bd5a-35897756354d req-b1ad2ceb-cbef-4ea7-b6a7-43cf49eb286c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.793 187216 DEBUG oslo_concurrency.lockutils [req-98a515cb-9eca-43d3-bd5a-35897756354d req-b1ad2ceb-cbef-4ea7-b6a7-43cf49eb286c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.793 187216 DEBUG nova.compute.manager [req-98a515cb-9eca-43d3-bd5a-35897756354d req-b1ad2ceb-cbef-4ea7-b6a7-43cf49eb286c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] No waiting events found dispatching network-vif-plugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.794 187216 WARNING nova.compute.manager [req-98a515cb-9eca-43d3-bd5a-35897756354d req-b1ad2ceb-cbef-4ea7-b6a7-43cf49eb286c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received unexpected event network-vif-plugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 for instance with vm_state active and task_state None.
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.851 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:36 compute-0 nova_compute[187212]: 2025-11-25 19:13:36.941 187216 INFO nova.compute.manager [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Took 14.50 seconds to build instance.
Nov 25 19:13:37 compute-0 nova_compute[187212]: 2025-11-25 19:13:37.008 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:37 compute-0 nova_compute[187212]: 2025-11-25 19:13:37.447 187216 DEBUG oslo_concurrency.lockutils [None req-a576cd3b-3542-432a-b180-3afeb2f35064 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.021s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:13:40 compute-0 podman[211507]: 2025-11-25 19:13:40.179588741 +0000 UTC m=+0.105738995 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:13:41 compute-0 nova_compute[187212]: 2025-11-25 19:13:41.853 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:42 compute-0 nova_compute[187212]: 2025-11-25 19:13:42.013 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:46 compute-0 nova_compute[187212]: 2025-11-25 19:13:46.857 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:47 compute-0 nova_compute[187212]: 2025-11-25 19:13:47.016 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:50 compute-0 ovn_controller[95465]: 2025-11-25T19:13:50Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cb:b8:85 10.100.0.13
Nov 25 19:13:50 compute-0 ovn_controller[95465]: 2025-11-25T19:13:50Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cb:b8:85 10.100.0.13
Nov 25 19:13:50 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:50.406 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:13:50 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:50.407 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:13:50 compute-0 nova_compute[187212]: 2025-11-25 19:13:50.447 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:51 compute-0 nova_compute[187212]: 2025-11-25 19:13:51.859 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:52 compute-0 nova_compute[187212]: 2025-11-25 19:13:52.017 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:52 compute-0 podman[211550]: 2025-11-25 19:13:52.199969597 +0000 UTC m=+0.115215394 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:13:56 compute-0 nova_compute[187212]: 2025-11-25 19:13:56.863 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:57 compute-0 nova_compute[187212]: 2025-11-25 19:13:57.019 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:13:59 compute-0 podman[211574]: 2025-11-25 19:13:59.216093666 +0000 UTC m=+0.129581977 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4)
Nov 25 19:13:59 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:13:59.409 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:13:59 compute-0 podman[197585]: time="2025-11-25T19:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:13:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:13:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3048 "" "Go-http-client/1.1"
Nov 25 19:14:01 compute-0 openstack_network_exporter[199731]: ERROR   19:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:14:01 compute-0 openstack_network_exporter[199731]: ERROR   19:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:14:01 compute-0 openstack_network_exporter[199731]: ERROR   19:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:14:01 compute-0 openstack_network_exporter[199731]: ERROR   19:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:14:01 compute-0 openstack_network_exporter[199731]: ERROR   19:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:14:01 compute-0 nova_compute[187212]: 2025-11-25 19:14:01.865 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:02 compute-0 nova_compute[187212]: 2025-11-25 19:14:02.021 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:02 compute-0 podman[211600]: 2025-11-25 19:14:02.145576929 +0000 UTC m=+0.064277690 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Nov 25 19:14:06 compute-0 nova_compute[187212]: 2025-11-25 19:14:06.369 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:06 compute-0 nova_compute[187212]: 2025-11-25 19:14:06.370 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:06 compute-0 nova_compute[187212]: 2025-11-25 19:14:06.868 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:06 compute-0 nova_compute[187212]: 2025-11-25 19:14:06.878 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:14:07 compute-0 nova_compute[187212]: 2025-11-25 19:14:07.024 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:07 compute-0 podman[211631]: 2025-11-25 19:14:07.175945769 +0000 UTC m=+0.092630296 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm)
Nov 25 19:14:07 compute-0 nova_compute[187212]: 2025-11-25 19:14:07.448 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:07 compute-0 nova_compute[187212]: 2025-11-25 19:14:07.449 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:07 compute-0 nova_compute[187212]: 2025-11-25 19:14:07.459 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:14:07 compute-0 nova_compute[187212]: 2025-11-25 19:14:07.459 187216 INFO nova.compute.claims [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:14:08 compute-0 nova_compute[187212]: 2025-11-25 19:14:08.560 187216 DEBUG nova.compute.provider_tree [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:14:09 compute-0 nova_compute[187212]: 2025-11-25 19:14:09.067 187216 DEBUG nova.scheduler.client.report [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:14:09 compute-0 nova_compute[187212]: 2025-11-25 19:14:09.577 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:09 compute-0 nova_compute[187212]: 2025-11-25 19:14:09.578 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:14:10 compute-0 nova_compute[187212]: 2025-11-25 19:14:10.090 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:14:10 compute-0 nova_compute[187212]: 2025-11-25 19:14:10.091 187216 DEBUG nova.network.neutron [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:14:10 compute-0 nova_compute[187212]: 2025-11-25 19:14:10.091 187216 WARNING neutronclient.v2_0.client [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:10 compute-0 nova_compute[187212]: 2025-11-25 19:14:10.092 187216 WARNING neutronclient.v2_0.client [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:10 compute-0 nova_compute[187212]: 2025-11-25 19:14:10.599 187216 INFO nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:14:11 compute-0 nova_compute[187212]: 2025-11-25 19:14:11.114 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:14:11 compute-0 podman[211653]: 2025-11-25 19:14:11.167290331 +0000 UTC m=+0.089742997 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Nov 25 19:14:11 compute-0 nova_compute[187212]: 2025-11-25 19:14:11.212 187216 DEBUG nova.network.neutron [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Successfully created port: 03b3db32-8760-4f8f-8c29-8fe9aba447fe _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:14:11 compute-0 nova_compute[187212]: 2025-11-25 19:14:11.902 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.025 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.138 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.140 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.141 187216 INFO nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Creating image(s)
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.142 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "/var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.142 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "/var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.143 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "/var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.144 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.151 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.153 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.221 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.222 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.223 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.223 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.226 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.227 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.240 187216 DEBUG nova.network.neutron [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Successfully updated port: 03b3db32-8760-4f8f-8c29-8fe9aba447fe _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.283 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.283 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.298 187216 DEBUG nova.compute.manager [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-changed-03b3db32-8760-4f8f-8c29-8fe9aba447fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.299 187216 DEBUG nova.compute.manager [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Refreshing instance network info cache due to event network-changed-03b3db32-8760-4f8f-8c29-8fe9aba447fe. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.299 187216 DEBUG oslo_concurrency.lockutils [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-02775601-3840-4250-809d-622ab3cf2e99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.299 187216 DEBUG oslo_concurrency.lockutils [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-02775601-3840-4250-809d-622ab3cf2e99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.299 187216 DEBUG nova.network.neutron [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Refreshing network info cache for port 03b3db32-8760-4f8f-8c29-8fe9aba447fe _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.319 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.319 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.320 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.401 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.402 187216 DEBUG nova.virt.disk.api [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Checking if we can resize image /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.402 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.466 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.468 187216 DEBUG nova.virt.disk.api [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Cannot resize image /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.469 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.469 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Ensure instance console log exists: /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.470 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.471 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.471 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.750 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "refresh_cache-02775601-3840-4250-809d-622ab3cf2e99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:14:12 compute-0 nova_compute[187212]: 2025-11-25 19:14:12.806 187216 WARNING neutronclient.v2_0.client [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:13 compute-0 nova_compute[187212]: 2025-11-25 19:14:13.055 187216 DEBUG nova.network.neutron [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:14:13 compute-0 nova_compute[187212]: 2025-11-25 19:14:13.249 187216 DEBUG nova.network.neutron [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:14:13 compute-0 nova_compute[187212]: 2025-11-25 19:14:13.758 187216 DEBUG oslo_concurrency.lockutils [req-7a57e41a-e4b1-4e17-ae3d-ec33f8d708b6 req-c1485686-179d-40fb-9992-fe2c2c5e178a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-02775601-3840-4250-809d-622ab3cf2e99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:14:13 compute-0 nova_compute[187212]: 2025-11-25 19:14:13.759 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquired lock "refresh_cache-02775601-3840-4250-809d-622ab3cf2e99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:14:13 compute-0 nova_compute[187212]: 2025-11-25 19:14:13.759 187216 DEBUG nova.network.neutron [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:14:14 compute-0 nova_compute[187212]: 2025-11-25 19:14:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:14 compute-0 nova_compute[187212]: 2025-11-25 19:14:14.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:14:14 compute-0 nova_compute[187212]: 2025-11-25 19:14:14.583 187216 DEBUG nova.network.neutron [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:14:14 compute-0 nova_compute[187212]: 2025-11-25 19:14:14.832 187216 WARNING neutronclient.v2_0.client [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:14 compute-0 nova_compute[187212]: 2025-11-25 19:14:14.982 187216 DEBUG nova.network.neutron [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Updating instance_info_cache with network_info: [{"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.490 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Releasing lock "refresh_cache-02775601-3840-4250-809d-622ab3cf2e99" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.492 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Instance network_info: |[{"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.495 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Start _get_guest_xml network_info=[{"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.502 187216 WARNING nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.504 187216 DEBUG nova.virt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-740998153', uuid='02775601-3840-4250-809d-622ab3cf2e99'), owner=OwnerMeta(userid='7c561073d7c34a029574a6e2fb952944', username='tempest-TestExecuteActionsViaActuator-1103022868-project-admin', projectid='780511b4bf4d49299cc4d9b324261841', projectname='tempest-TestExecuteActionsViaActuator-1103022868'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": 
"03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098055.5040522) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.508 187216 DEBUG nova.virt.libvirt.host [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.509 187216 DEBUG nova.virt.libvirt.host [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.512 187216 DEBUG nova.virt.libvirt.host [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.513 187216 DEBUG nova.virt.libvirt.host [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.514 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.515 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.516 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.516 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.516 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.517 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.517 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.517 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.518 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.518 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.518 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.518 187216 DEBUG nova.virt.hardware [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.524 187216 DEBUG nova.virt.libvirt.vif [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-740998153',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-740998153',id=9,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-c525g54q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1103022868-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:14:11Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=02775601-3840-4250-809d-622ab3cf2e99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.525 187216 DEBUG nova.network.os_vif_util [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.526 187216 DEBUG nova.network.os_vif_util [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:14:15 compute-0 nova_compute[187212]: 2025-11-25 19:14:15.527 187216 DEBUG nova.objects.instance [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02775601-3840-4250-809d-622ab3cf2e99 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.035 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <uuid>02775601-3840-4250-809d-622ab3cf2e99</uuid>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <name>instance-00000009</name>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-740998153</nova:name>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:14:15</nova:creationTime>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:14:16 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:14:16 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:user uuid="7c561073d7c34a029574a6e2fb952944">tempest-TestExecuteActionsViaActuator-1103022868-project-admin</nova:user>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:project uuid="780511b4bf4d49299cc4d9b324261841">tempest-TestExecuteActionsViaActuator-1103022868</nova:project>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         <nova:port uuid="03b3db32-8760-4f8f-8c29-8fe9aba447fe">
Nov 25 19:14:16 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <system>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <entry name="serial">02775601-3840-4250-809d-622ab3cf2e99</entry>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <entry name="uuid">02775601-3840-4250-809d-622ab3cf2e99</entry>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </system>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <os>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </os>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <features>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </features>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.config"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:7f:73:1f"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <target dev="tap03b3db32-87"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/console.log" append="off"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <video>
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </video>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:14:16 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:14:16 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:14:16 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:14:16 compute-0 nova_compute[187212]: </domain>
Nov 25 19:14:16 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.036 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Preparing to wait for external event network-vif-plugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.036 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.037 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.037 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.038 187216 DEBUG nova.virt.libvirt.vif [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-740998153',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-740998153',id=9,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-c525g54q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:14:11Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=02775601-3840-4250-809d-622ab3cf2e99,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.039 187216 DEBUG nova.network.os_vif_util [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.040 187216 DEBUG nova.network.os_vif_util [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.040 187216 DEBUG os_vif [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.042 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.042 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.043 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.045 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.045 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9973da65-b531-587e-9b4d-5097d7eec30e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.047 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.050 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.055 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03b3db32-87, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.056 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap03b3db32-87, col_values=(('qos', UUID('90a7514d-f3a8-4faa-a7b9-832e7c0e8bdc')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.056 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap03b3db32-87, col_values=(('external_ids', {'iface-id': '03b3db32-8760-4f8f-8c29-8fe9aba447fe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:73:1f', 'vm-uuid': '02775601-3840-4250-809d-622ab3cf2e99'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:16 compute-0 NetworkManager[55552]: <info>  [1764098056.0596] manager: (tap03b3db32-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.058 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.061 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.069 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.069 187216 INFO os_vif [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87')
Nov 25 19:14:16 compute-0 nova_compute[187212]: 2025-11-25 19:14:16.904 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:17 compute-0 nova_compute[187212]: 2025-11-25 19:14:17.628 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:14:17 compute-0 nova_compute[187212]: 2025-11-25 19:14:17.629 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:14:17 compute-0 nova_compute[187212]: 2025-11-25 19:14:17.629 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] No VIF found with MAC fa:16:3e:7f:73:1f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:14:17 compute-0 nova_compute[187212]: 2025-11-25 19:14:17.630 187216 INFO nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Using config drive
Nov 25 19:14:18 compute-0 nova_compute[187212]: 2025-11-25 19:14:18.143 187216 WARNING neutronclient.v2_0.client [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:18 compute-0 nova_compute[187212]: 2025-11-25 19:14:18.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:18 compute-0 nova_compute[187212]: 2025-11-25 19:14:18.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.146 187216 INFO nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Creating config drive at /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.config
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.157 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpvvso6gxh execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.300 187216 DEBUG oslo_concurrency.processutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpvvso6gxh" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:19 compute-0 kernel: tap03b3db32-87: entered promiscuous mode
Nov 25 19:14:19 compute-0 NetworkManager[55552]: <info>  [1764098059.3886] manager: (tap03b3db32-87): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Nov 25 19:14:19 compute-0 ovn_controller[95465]: 2025-11-25T19:14:19Z|00062|binding|INFO|Claiming lport 03b3db32-8760-4f8f-8c29-8fe9aba447fe for this chassis.
Nov 25 19:14:19 compute-0 ovn_controller[95465]: 2025-11-25T19:14:19Z|00063|binding|INFO|03b3db32-8760-4f8f-8c29-8fe9aba447fe: Claiming fa:16:3e:7f:73:1f 10.100.0.8
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.449 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.458 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:73:1f 10.100.0.8'], port_security=['fa:16:3e:7f:73:1f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '02775601-3840-4250-809d-622ab3cf2e99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=03b3db32-8760-4f8f-8c29-8fe9aba447fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.459 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 03b3db32-8760-4f8f-8c29-8fe9aba447fe in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 bound to our chassis
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.462 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:14:19 compute-0 ovn_controller[95465]: 2025-11-25T19:14:19Z|00064|binding|INFO|Setting lport 03b3db32-8760-4f8f-8c29-8fe9aba447fe ovn-installed in OVS
Nov 25 19:14:19 compute-0 ovn_controller[95465]: 2025-11-25T19:14:19Z|00065|binding|INFO|Setting lport 03b3db32-8760-4f8f-8c29-8fe9aba447fe up in Southbound
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.476 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.485 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[982c4929-9d12-4427-ab37-804982282af8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 systemd-machined[153494]: New machine qemu-5-instance-00000009.
Nov 25 19:14:19 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Nov 25 19:14:19 compute-0 systemd-udevd[211712]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.534 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[50ebc27c-eeee-4002-b654-fb4ba873b1b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.538 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[66f10391-d27b-4a6b-b7a4-ab32c31763bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 NetworkManager[55552]: <info>  [1764098059.5518] device (tap03b3db32-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:14:19 compute-0 NetworkManager[55552]: <info>  [1764098059.5534] device (tap03b3db32-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.583 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[836cd8e6-317a-4e9f-a16b-2b8ace530a3b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.613 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[53317778-53e5-42e9-9fe3-2434229fffdf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 44288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211722, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.636 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ff1f06ad-67ab-4093-925b-bf0b4db2b1fd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211723, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211723, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.637 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.639 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.640 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.641 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.641 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.642 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.642 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:19.644 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5a8efe-e19b-43ee-a753-26051b78e482]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.722 187216 DEBUG nova.compute.manager [req-ea402516-5004-4bad-a0d6-3cdabe9081de req-ce8c8de3-77db-4506-bfa7-3e03884514d9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-plugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.723 187216 DEBUG oslo_concurrency.lockutils [req-ea402516-5004-4bad-a0d6-3cdabe9081de req-ce8c8de3-77db-4506-bfa7-3e03884514d9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.723 187216 DEBUG oslo_concurrency.lockutils [req-ea402516-5004-4bad-a0d6-3cdabe9081de req-ce8c8de3-77db-4506-bfa7-3e03884514d9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.723 187216 DEBUG oslo_concurrency.lockutils [req-ea402516-5004-4bad-a0d6-3cdabe9081de req-ce8c8de3-77db-4506-bfa7-3e03884514d9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.724 187216 DEBUG nova.compute.manager [req-ea402516-5004-4bad-a0d6-3cdabe9081de req-ce8c8de3-77db-4506-bfa7-3e03884514d9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Processing event network-vif-plugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.853 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.857 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.861 187216 INFO nova.virt.libvirt.driver [-] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Instance spawned successfully.
Nov 25 19:14:19 compute-0 nova_compute[187212]: 2025-11-25 19:14:19.863 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.377 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.378 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.378 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.379 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.380 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.381 187216 DEBUG nova.virt.libvirt.driver [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.688 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.891 187216 INFO nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Took 8.75 seconds to spawn the instance on the hypervisor.
Nov 25 19:14:20 compute-0 nova_compute[187212]: 2025-11-25 19:14:20.892 187216 DEBUG nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.059 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.424 187216 INFO nova.compute.manager [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Took 14.03 seconds to build instance.
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.744 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.798 187216 DEBUG nova.compute.manager [req-e3af4382-8c04-4b50-a152-0e5172895700 req-635aac66-9b7f-4e7b-af25-c3d67d489148 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-plugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.799 187216 DEBUG oslo_concurrency.lockutils [req-e3af4382-8c04-4b50-a152-0e5172895700 req-635aac66-9b7f-4e7b-af25-c3d67d489148 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.799 187216 DEBUG oslo_concurrency.lockutils [req-e3af4382-8c04-4b50-a152-0e5172895700 req-635aac66-9b7f-4e7b-af25-c3d67d489148 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.799 187216 DEBUG oslo_concurrency.lockutils [req-e3af4382-8c04-4b50-a152-0e5172895700 req-635aac66-9b7f-4e7b-af25-c3d67d489148 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.799 187216 DEBUG nova.compute.manager [req-e3af4382-8c04-4b50-a152-0e5172895700 req-635aac66-9b7f-4e7b-af25-c3d67d489148 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] No waiting events found dispatching network-vif-plugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.800 187216 WARNING nova.compute.manager [req-e3af4382-8c04-4b50-a152-0e5172895700 req-635aac66-9b7f-4e7b-af25-c3d67d489148 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received unexpected event network-vif-plugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe for instance with vm_state active and task_state None.
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.838 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.838 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.903 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.906 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.910 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.934 187216 DEBUG oslo_concurrency.lockutils [None req-d19d6334-521c-44ee-862c-079fe950fccf 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.564s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.997 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:21 compute-0 nova_compute[187212]: 2025-11-25 19:14:21.998 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.054 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.063 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.114 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.114 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.163 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.170 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.225 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.226 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.273 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.466 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.467 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.495 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.496 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5337MB free_disk=72.91024017333984GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.496 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:22 compute-0 nova_compute[187212]: 2025-11-25 19:14:22.497 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:23 compute-0 podman[211758]: 2025-11-25 19:14:23.174585917 +0000 UTC m=+0.092463681 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.571 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 909b423a-9e57-4bb8-b6b5-719b05724d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.572 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance dd2a5303-3518-4f79-aa7b-45fc96059d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.572 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 6fe2a300-76bb-44b4-8828-f87977451114 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.572 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 02775601-3840-4250-809d-622ab3cf2e99 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.573 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.573 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1088MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:14:22 up  1:06,  0 user,  load average: 0.55, 0.45, 0.47\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '4', 'num_os_type_None': '4', 'num_proj_780511b4bf4d49299cc4d9b324261841': '4', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:14:23 compute-0 nova_compute[187212]: 2025-11-25 19:14:23.654 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:14:24 compute-0 nova_compute[187212]: 2025-11-25 19:14:24.161 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:14:24 compute-0 nova_compute[187212]: 2025-11-25 19:14:24.671 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:14:24 compute-0 nova_compute[187212]: 2025-11-25 19:14:24.671 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.174s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:25 compute-0 nova_compute[187212]: 2025-11-25 19:14:25.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:25 compute-0 nova_compute[187212]: 2025-11-25 19:14:25.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:25 compute-0 nova_compute[187212]: 2025-11-25 19:14:25.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:25 compute-0 nova_compute[187212]: 2025-11-25 19:14:25.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:14:25 compute-0 nova_compute[187212]: 2025-11-25 19:14:25.682 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:14:26 compute-0 nova_compute[187212]: 2025-11-25 19:14:26.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:26 compute-0 nova_compute[187212]: 2025-11-25 19:14:26.683 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:26 compute-0 nova_compute[187212]: 2025-11-25 19:14:26.911 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:27 compute-0 nova_compute[187212]: 2025-11-25 19:14:27.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:27 compute-0 nova_compute[187212]: 2025-11-25 19:14:27.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:14:29 compute-0 nova_compute[187212]: 2025-11-25 19:14:29.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:14:29 compute-0 podman[197585]: time="2025-11-25T19:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:14:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:14:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Nov 25 19:14:30 compute-0 podman[211782]: 2025-11-25 19:14:30.243635044 +0000 UTC m=+0.165987892 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:14:31 compute-0 nova_compute[187212]: 2025-11-25 19:14:31.062 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:31.083 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:31.084 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:31.084 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: ERROR   19:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: ERROR   19:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: ERROR   19:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: ERROR   19:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: ERROR   19:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:14:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:14:31 compute-0 nova_compute[187212]: 2025-11-25 19:14:31.912 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:33 compute-0 podman[211830]: 2025-11-25 19:14:33.179093381 +0000 UTC m=+0.102847595 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:14:34 compute-0 ovn_controller[95465]: 2025-11-25T19:14:34Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:73:1f 10.100.0.8
Nov 25 19:14:34 compute-0 ovn_controller[95465]: 2025-11-25T19:14:34Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:73:1f 10.100.0.8
Nov 25 19:14:34 compute-0 nova_compute[187212]: 2025-11-25 19:14:34.614 187216 DEBUG nova.compute.manager [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Nov 25 19:14:35 compute-0 nova_compute[187212]: 2025-11-25 19:14:35.189 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:35 compute-0 nova_compute[187212]: 2025-11-25 19:14:35.190 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:35 compute-0 nova_compute[187212]: 2025-11-25 19:14:35.706 187216 DEBUG nova.objects.instance [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'pci_requests' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:35 compute-0 nova_compute[187212]: 2025-11-25 19:14:35.856 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Creating tmpfile /var/lib/nova/instances/tmp6q0k6_by to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Nov 25 19:14:35 compute-0 nova_compute[187212]: 2025-11-25 19:14:35.857 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:35 compute-0 nova_compute[187212]: 2025-11-25 19:14:35.962 187216 DEBUG nova.compute.manager [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=69632,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6q0k6_by',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.064 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.218 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.218 187216 INFO nova.compute.claims [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.219 187216 DEBUG nova.objects.instance [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'resources' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.726 187216 DEBUG nova.objects.base [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<6b3e3c17-d75c-4789-b83d-55f74f5d041f> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.727 187216 DEBUG nova.objects.instance [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'numa_topology' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:36 compute-0 nova_compute[187212]: 2025-11-25 19:14:36.914 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:37 compute-0 nova_compute[187212]: 2025-11-25 19:14:37.266 187216 DEBUG nova.objects.base [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<6b3e3c17-d75c-4789-b83d-55f74f5d041f> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:14:37 compute-0 nova_compute[187212]: 2025-11-25 19:14:37.267 187216 DEBUG nova.objects.instance [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:37 compute-0 nova_compute[187212]: 2025-11-25 19:14:37.783 187216 DEBUG nova.objects.base [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<6b3e3c17-d75c-4789-b83d-55f74f5d041f> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:14:37 compute-0 nova_compute[187212]: 2025-11-25 19:14:37.996 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:38 compute-0 podman[211849]: 2025-11-25 19:14:38.20786367 +0000 UTC m=+0.114652109 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 19:14:38 compute-0 nova_compute[187212]: 2025-11-25 19:14:38.319 187216 INFO nova.compute.resource_tracker [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updating resource usage from migration 1c8619ae-2ce3-457e-979d-c22602d841af
Nov 25 19:14:38 compute-0 nova_compute[187212]: 2025-11-25 19:14:38.319 187216 DEBUG nova.compute.resource_tracker [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Starting to track incoming migration 1c8619ae-2ce3-457e-979d-c22602d841af with flavor d7d5bae9-10ca-4750-9d69-ce73a869da56 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Nov 25 19:14:39 compute-0 nova_compute[187212]: 2025-11-25 19:14:39.023 187216 DEBUG nova.compute.provider_tree [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:14:39 compute-0 nova_compute[187212]: 2025-11-25 19:14:39.532 187216 DEBUG nova.scheduler.client.report [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:14:40 compute-0 nova_compute[187212]: 2025-11-25 19:14:40.044 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.854s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:40 compute-0 nova_compute[187212]: 2025-11-25 19:14:40.044 187216 INFO nova.compute.manager [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Migrating
Nov 25 19:14:41 compute-0 nova_compute[187212]: 2025-11-25 19:14:41.067 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:41 compute-0 nova_compute[187212]: 2025-11-25 19:14:41.947 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:42 compute-0 podman[211870]: 2025-11-25 19:14:42.162287838 +0000 UTC m=+0.080746570 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:14:43 compute-0 nova_compute[187212]: 2025-11-25 19:14:43.028 187216 DEBUG nova.compute.manager [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=69632,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6q0k6_by',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='26b4dae6-9c02-403b-b6cb-faf8fa8bb35a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Nov 25 19:14:44 compute-0 nova_compute[187212]: 2025-11-25 19:14:44.043 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:14:44 compute-0 nova_compute[187212]: 2025-11-25 19:14:44.043 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:14:44 compute-0 nova_compute[187212]: 2025-11-25 19:14:44.044 187216 DEBUG nova.network.neutron [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:14:44 compute-0 sshd-session[211890]: Accepted publickey for nova from 192.168.122.101 port 42144 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:14:44 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 25 19:14:44 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 25 19:14:44 compute-0 systemd-logind[820]: New session 34 of user nova.
Nov 25 19:14:44 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 25 19:14:44 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 25 19:14:44 compute-0 systemd[211894]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:14:44 compute-0 systemd[211894]: Queued start job for default target Main User Target.
Nov 25 19:14:44 compute-0 systemd[211894]: Created slice User Application Slice.
Nov 25 19:14:44 compute-0 systemd[211894]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 25 19:14:44 compute-0 systemd[211894]: Started Daily Cleanup of User's Temporary Directories.
Nov 25 19:14:44 compute-0 systemd[211894]: Reached target Paths.
Nov 25 19:14:44 compute-0 systemd[211894]: Reached target Timers.
Nov 25 19:14:44 compute-0 systemd[211894]: Starting D-Bus User Message Bus Socket...
Nov 25 19:14:44 compute-0 systemd[211894]: Starting Create User's Volatile Files and Directories...
Nov 25 19:14:44 compute-0 systemd[211894]: Finished Create User's Volatile Files and Directories.
Nov 25 19:14:44 compute-0 systemd[211894]: Listening on D-Bus User Message Bus Socket.
Nov 25 19:14:44 compute-0 systemd[211894]: Reached target Sockets.
Nov 25 19:14:44 compute-0 systemd[211894]: Reached target Basic System.
Nov 25 19:14:44 compute-0 systemd[211894]: Reached target Main User Target.
Nov 25 19:14:44 compute-0 systemd[211894]: Startup finished in 162ms.
Nov 25 19:14:44 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 25 19:14:44 compute-0 systemd[1]: Started Session 34 of User nova.
Nov 25 19:14:44 compute-0 sshd-session[211890]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:14:44 compute-0 nova_compute[187212]: 2025-11-25 19:14:44.550 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:44 compute-0 sshd-session[211909]: Received disconnect from 192.168.122.101 port 42144:11: disconnected by user
Nov 25 19:14:44 compute-0 sshd-session[211909]: Disconnected from user nova 192.168.122.101 port 42144
Nov 25 19:14:44 compute-0 sshd-session[211890]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:14:44 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Nov 25 19:14:44 compute-0 systemd-logind[820]: Session 34 logged out. Waiting for processes to exit.
Nov 25 19:14:44 compute-0 systemd-logind[820]: Removed session 34.
Nov 25 19:14:44 compute-0 sshd-session[211911]: Accepted publickey for nova from 192.168.122.101 port 42160 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:14:44 compute-0 systemd-logind[820]: New session 36 of user nova.
Nov 25 19:14:44 compute-0 systemd[1]: Started Session 36 of User nova.
Nov 25 19:14:44 compute-0 sshd-session[211911]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:14:44 compute-0 sshd-session[211914]: Received disconnect from 192.168.122.101 port 42160:11: disconnected by user
Nov 25 19:14:44 compute-0 sshd-session[211914]: Disconnected from user nova 192.168.122.101 port 42160
Nov 25 19:14:44 compute-0 sshd-session[211911]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:14:44 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Nov 25 19:14:44 compute-0 systemd-logind[820]: Session 36 logged out. Waiting for processes to exit.
Nov 25 19:14:44 compute-0 systemd-logind[820]: Removed session 36.
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.042 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.260 187216 DEBUG nova.network.neutron [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Updating instance_info_cache with network_info: [{"id": "1d533527-0957-4c0b-893c-65f597515760", "address": "fa:16:3e:53:b0:8d", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d533527-09", "ovs_interfaceid": "1d533527-0957-4c0b-893c-65f597515760", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.769 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.782 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=69632,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6q0k6_by',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='26b4dae6-9c02-403b-b6cb-faf8fa8bb35a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.783 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Creating instance directory: /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.783 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Creating disk.info with the contents: {'/var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk': 'qcow2', '/var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.784 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Nov 25 19:14:45 compute-0 nova_compute[187212]: 2025-11-25 19:14:45.785 187216 DEBUG nova.objects.instance [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.069 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.293 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.300 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.305 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.395 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.397 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.397 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.398 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.404 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.405 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.480 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.482 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.534 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.536 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.537 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.613 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.616 187216 DEBUG nova.virt.disk.api [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Checking if we can resize image /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.617 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.681 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.683 187216 DEBUG nova.virt.disk.api [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Cannot resize image /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.684 187216 DEBUG nova.objects.instance [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'migration_context' on Instance uuid 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:46 compute-0 nova_compute[187212]: 2025-11-25 19:14:46.951 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.250 187216 DEBUG nova.objects.base [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<26b4dae6-9c02-403b-b6cb-faf8fa8bb35a> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.251 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.280 187216 DEBUG oslo_concurrency.processutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.281 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.282 187216 DEBUG nova.virt.libvirt.vif [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-11-25T19:12:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-754987485',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-754987485',id=6,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:13:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-blialy70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:13:14Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=26b4dae6-9c02-403b-b6cb-faf8fa8bb35a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d533527-0957-4c0b-893c-65f597515760", "address": "fa:16:3e:53:b0:8d", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d533527-09", "ovs_interfaceid": "1d533527-0957-4c0b-893c-65f597515760", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.283 187216 DEBUG nova.network.os_vif_util [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "1d533527-0957-4c0b-893c-65f597515760", "address": "fa:16:3e:53:b0:8d", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1d533527-09", "ovs_interfaceid": "1d533527-0957-4c0b-893c-65f597515760", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.283 187216 DEBUG nova.network.os_vif_util [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:b0:8d,bridge_name='br-int',has_traffic_filtering=True,id=1d533527-0957-4c0b-893c-65f597515760,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d533527-09') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.284 187216 DEBUG os_vif [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:b0:8d,bridge_name='br-int',has_traffic_filtering=True,id=1d533527-0957-4c0b-893c-65f597515760,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d533527-09') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.284 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.285 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.285 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.286 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.286 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '069240ff-7be4-56f7-a490-319b770e4824', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.287 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.289 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.293 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.293 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d533527-09, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.293 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1d533527-09, col_values=(('qos', UUID('15e76350-7c5e-41b7-b005-0e0887c59dd4')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.293 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1d533527-09, col_values=(('external_ids', {'iface-id': '1d533527-0957-4c0b-893c-65f597515760', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:b0:8d', 'vm-uuid': '26b4dae6-9c02-403b-b6cb-faf8fa8bb35a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.294 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 NetworkManager[55552]: <info>  [1764098087.2960] manager: (tap1d533527-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.297 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.305 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.307 187216 INFO os_vif [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:b0:8d,bridge_name='br-int',has_traffic_filtering=True,id=1d533527-0957-4c0b-893c-65f597515760,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d533527-09')
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.308 187216 DEBUG nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.308 187216 DEBUG nova.compute.manager [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=69632,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6q0k6_by',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='26b4dae6-9c02-403b-b6cb-faf8fa8bb35a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.310 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.507 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.572 187216 DEBUG nova.compute.manager [req-5495100d-ddf4-4fed-9d5b-832276a5bc66 req-a503f777-7bf4-4b41-9f4d-b19c9df7df0e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.572 187216 DEBUG oslo_concurrency.lockutils [req-5495100d-ddf4-4fed-9d5b-832276a5bc66 req-a503f777-7bf4-4b41-9f4d-b19c9df7df0e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.573 187216 DEBUG oslo_concurrency.lockutils [req-5495100d-ddf4-4fed-9d5b-832276a5bc66 req-a503f777-7bf4-4b41-9f4d-b19c9df7df0e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.573 187216 DEBUG oslo_concurrency.lockutils [req-5495100d-ddf4-4fed-9d5b-832276a5bc66 req-a503f777-7bf4-4b41-9f4d-b19c9df7df0e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.574 187216 DEBUG nova.compute.manager [req-5495100d-ddf4-4fed-9d5b-832276a5bc66 req-a503f777-7bf4-4b41-9f4d-b19c9df7df0e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] No waiting events found dispatching network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:14:47 compute-0 nova_compute[187212]: 2025-11-25 19:14:47.574 187216 WARNING nova.compute.manager [req-5495100d-ddf4-4fed-9d5b-832276a5bc66 req-a503f777-7bf4-4b41-9f4d-b19c9df7df0e 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received unexpected event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for instance with vm_state active and task_state resize_migrating.
Nov 25 19:14:48 compute-0 sshd-session[211936]: Accepted publickey for nova from 192.168.122.101 port 42162 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:14:48 compute-0 systemd-logind[820]: New session 37 of user nova.
Nov 25 19:14:48 compute-0 systemd[1]: Started Session 37 of User nova.
Nov 25 19:14:48 compute-0 sshd-session[211936]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:14:48 compute-0 nova_compute[187212]: 2025-11-25 19:14:48.451 187216 DEBUG nova.network.neutron [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Port 1d533527-0957-4c0b-893c-65f597515760 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Nov 25 19:14:48 compute-0 nova_compute[187212]: 2025-11-25 19:14:48.538 187216 DEBUG nova.compute.manager [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=69632,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6q0k6_by',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='26b4dae6-9c02-403b-b6cb-faf8fa8bb35a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Nov 25 19:14:48 compute-0 sshd-session[211939]: Received disconnect from 192.168.122.101 port 42162:11: disconnected by user
Nov 25 19:14:48 compute-0 sshd-session[211939]: Disconnected from user nova 192.168.122.101 port 42162
Nov 25 19:14:48 compute-0 sshd-session[211936]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:14:48 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Nov 25 19:14:48 compute-0 systemd-logind[820]: Session 37 logged out. Waiting for processes to exit.
Nov 25 19:14:48 compute-0 systemd-logind[820]: Removed session 37.
Nov 25 19:14:49 compute-0 sshd-session[211941]: Accepted publickey for nova from 192.168.122.101 port 42178 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:14:49 compute-0 nova_compute[187212]: 2025-11-25 19:14:49.628 187216 DEBUG nova.compute.manager [req-e5dc1755-195e-448b-8f1c-2a003c9e62f9 req-072ae6f7-3b3c-4ac7-accb-a8be3e8f7a99 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:14:49 compute-0 nova_compute[187212]: 2025-11-25 19:14:49.629 187216 DEBUG oslo_concurrency.lockutils [req-e5dc1755-195e-448b-8f1c-2a003c9e62f9 req-072ae6f7-3b3c-4ac7-accb-a8be3e8f7a99 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:49 compute-0 nova_compute[187212]: 2025-11-25 19:14:49.630 187216 DEBUG oslo_concurrency.lockutils [req-e5dc1755-195e-448b-8f1c-2a003c9e62f9 req-072ae6f7-3b3c-4ac7-accb-a8be3e8f7a99 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:49 compute-0 nova_compute[187212]: 2025-11-25 19:14:49.630 187216 DEBUG oslo_concurrency.lockutils [req-e5dc1755-195e-448b-8f1c-2a003c9e62f9 req-072ae6f7-3b3c-4ac7-accb-a8be3e8f7a99 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:49 compute-0 nova_compute[187212]: 2025-11-25 19:14:49.630 187216 DEBUG nova.compute.manager [req-e5dc1755-195e-448b-8f1c-2a003c9e62f9 req-072ae6f7-3b3c-4ac7-accb-a8be3e8f7a99 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] No waiting events found dispatching network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:14:49 compute-0 nova_compute[187212]: 2025-11-25 19:14:49.630 187216 WARNING nova.compute.manager [req-e5dc1755-195e-448b-8f1c-2a003c9e62f9 req-072ae6f7-3b3c-4ac7-accb-a8be3e8f7a99 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received unexpected event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for instance with vm_state active and task_state resize_migrating.
Nov 25 19:14:49 compute-0 systemd-logind[820]: New session 38 of user nova.
Nov 25 19:14:49 compute-0 systemd[1]: Started Session 38 of User nova.
Nov 25 19:14:49 compute-0 sshd-session[211941]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:14:49 compute-0 sshd-session[211944]: Received disconnect from 192.168.122.101 port 42178:11: disconnected by user
Nov 25 19:14:49 compute-0 sshd-session[211944]: Disconnected from user nova 192.168.122.101 port 42178
Nov 25 19:14:49 compute-0 sshd-session[211941]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:14:49 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Nov 25 19:14:49 compute-0 systemd-logind[820]: Session 38 logged out. Waiting for processes to exit.
Nov 25 19:14:49 compute-0 systemd-logind[820]: Removed session 38.
Nov 25 19:14:49 compute-0 sshd-session[211946]: Accepted publickey for nova from 192.168.122.101 port 42190 ssh2: ECDSA SHA256:7f97V+BtuG/G8AzFyBc95O9wYeKTsJWYe9xE+clYnE4
Nov 25 19:14:49 compute-0 systemd-logind[820]: New session 39 of user nova.
Nov 25 19:14:49 compute-0 systemd[1]: Started Session 39 of User nova.
Nov 25 19:14:49 compute-0 sshd-session[211946]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 25 19:14:50 compute-0 sshd-session[211949]: Received disconnect from 192.168.122.101 port 42190:11: disconnected by user
Nov 25 19:14:50 compute-0 sshd-session[211949]: Disconnected from user nova 192.168.122.101 port 42190
Nov 25 19:14:50 compute-0 sshd-session[211946]: pam_unix(sshd:session): session closed for user nova
Nov 25 19:14:50 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Nov 25 19:14:50 compute-0 systemd-logind[820]: Session 39 logged out. Waiting for processes to exit.
Nov 25 19:14:50 compute-0 systemd-logind[820]: Removed session 39.
Nov 25 19:14:51 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 25 19:14:51 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:51.999 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:52 compute-0 kernel: tap1d533527-09: entered promiscuous mode
Nov 25 19:14:52 compute-0 NetworkManager[55552]: <info>  [1764098092.1660] manager: (tap1d533527-09): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:52.169 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:52 compute-0 ovn_controller[95465]: 2025-11-25T19:14:52Z|00066|binding|INFO|Claiming lport 1d533527-0957-4c0b-893c-65f597515760 for this additional chassis.
Nov 25 19:14:52 compute-0 ovn_controller[95465]: 2025-11-25T19:14:52Z|00067|binding|INFO|1d533527-0957-4c0b-893c-65f597515760: Claiming fa:16:3e:53:b0:8d 10.100.0.4
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.180 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:b0:8d 10.100.0.4'], port_security=['fa:16:3e:53:b0:8d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '26b4dae6-9c02-403b-b6cb-faf8fa8bb35a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=1d533527-0957-4c0b-893c-65f597515760) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.181 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 1d533527-0957-4c0b-893c-65f597515760 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.183 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:14:52 compute-0 ovn_controller[95465]: 2025-11-25T19:14:52Z|00068|binding|INFO|Setting lport 1d533527-0957-4c0b-893c-65f597515760 ovn-installed in OVS
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:52.193 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:52.195 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.200 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8291de49-2ae4-49c9-901c-0238aa695a9b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:52.210 187216 WARNING neutronclient.v2_0.client [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:52 compute-0 systemd-machined[153494]: New machine qemu-6-instance-00000006.
Nov 25 19:14:52 compute-0 systemd-udevd[211988]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:14:52 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.234 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[09a80e7e-e15f-4f52-bec0-9d04ce5e43c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.236 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6d430f-a96d-4284-b181-930703a2ac2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:52 compute-0 NetworkManager[55552]: <info>  [1764098092.2454] device (tap1d533527-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:14:52 compute-0 NetworkManager[55552]: <info>  [1764098092.2468] device (tap1d533527-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.271 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[6d669167-d304-487c-9c40-acb8f536bc67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.289 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[60fc527a-a250-4d96-96d8-0047dec3183c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 11, 'rx_bytes': 868, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211993, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:52.295 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.303 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[82933ef8-142e-42bd-97e8-01253f6b9694]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211997, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211997, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.305 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:52 compute-0 nova_compute[187212]: 2025-11-25 19:14:52.307 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.309 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.309 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.309 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.309 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:52.311 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[870b5808-c5f2-4b1d-9dae-9018cb82edb0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:14:53 compute-0 nova_compute[187212]: 2025-11-25 19:14:53.092 187216 INFO nova.network.neutron [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updating port 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 25 19:14:54 compute-0 podman[212017]: 2025-11-25 19:14:54.202751794 +0000 UTC m=+0.117815815 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.667 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-6b3e3c17-d75c-4789-b83d-55f74f5d041f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.668 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-6b3e3c17-d75c-4789-b83d-55f74f5d041f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.668 187216 DEBUG nova.network.neutron [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:14:54 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:54.777 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:54 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:14:54.779 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.780 187216 DEBUG nova.compute.manager [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-changed-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.781 187216 DEBUG nova.compute.manager [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Refreshing instance network info cache due to event network-changed-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:14:54 compute-0 nova_compute[187212]: 2025-11-25 19:14:54.781 187216 DEBUG oslo_concurrency.lockutils [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-6b3e3c17-d75c-4789-b83d-55f74f5d041f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:14:54 compute-0 ovn_controller[95465]: 2025-11-25T19:14:54Z|00069|binding|INFO|Claiming lport 1d533527-0957-4c0b-893c-65f597515760 for this chassis.
Nov 25 19:14:54 compute-0 ovn_controller[95465]: 2025-11-25T19:14:54Z|00070|binding|INFO|1d533527-0957-4c0b-893c-65f597515760: Claiming fa:16:3e:53:b0:8d 10.100.0.4
Nov 25 19:14:54 compute-0 ovn_controller[95465]: 2025-11-25T19:14:54Z|00071|binding|INFO|Setting lport 1d533527-0957-4c0b-893c-65f597515760 up in Southbound
Nov 25 19:14:55 compute-0 nova_compute[187212]: 2025-11-25 19:14:55.199 187216 WARNING neutronclient.v2_0.client [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.305 187216 INFO nova.compute.manager [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Post operation of migration started
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.306 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.462 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.463 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.533 187216 WARNING neutronclient.v2_0.client [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.631 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.632 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.633 187216 DEBUG nova.network.neutron [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:14:56 compute-0 nova_compute[187212]: 2025-11-25 19:14:56.788 187216 DEBUG nova.network.neutron [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updating instance_info_cache with network_info: [{"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.004 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.141 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.297 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.367 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-6b3e3c17-d75c-4789-b83d-55f74f5d041f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.373 187216 DEBUG oslo_concurrency.lockutils [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-6b3e3c17-d75c-4789-b83d-55f74f5d041f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.374 187216 DEBUG nova.network.neutron [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Refreshing network info cache for port 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:14:57 compute-0 nova_compute[187212]: 2025-11-25 19:14:57.891 187216 WARNING neutronclient.v2_0.client [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.004 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.006 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.007 187216 INFO nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Creating image(s)
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.009 187216 DEBUG nova.objects.instance [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.545 187216 DEBUG oslo_concurrency.processutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.607 187216 DEBUG oslo_concurrency.processutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.608 187216 DEBUG nova.virt.disk.api [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Checking if we can resize image /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.609 187216 DEBUG oslo_concurrency.processutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.682 187216 DEBUG oslo_concurrency.processutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:58 compute-0 nova_compute[187212]: 2025-11-25 19:14:58.683 187216 DEBUG nova.virt.disk.api [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Cannot resize image /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.184 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.190 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.190 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Ensure instance console log exists: /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.191 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.191 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.191 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.194 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Start _get_guest_xml network_info=[{"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:98:25:60"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.202 187216 WARNING nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.204 187216 DEBUG nova.virt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-515888401', uuid='6b3e3c17-d75c-4789-b83d-55f74f5d041f'), owner=OwnerMeta(userid='7c561073d7c34a029574a6e2fb952944', username='tempest-TestExecuteActionsViaActuator-1103022868-project-admin', projectid='780511b4bf4d49299cc4d9b324261841', projectname='tempest-TestExecuteActionsViaActuator-1103022868'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:98:25:60"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098099.2040174) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.210 187216 DEBUG nova.virt.libvirt.host [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.211 187216 DEBUG nova.virt.libvirt.host [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.215 187216 DEBUG nova.virt.libvirt.host [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.216 187216 DEBUG nova.virt.libvirt.host [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.217 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.218 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.218 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.218 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.219 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.219 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.219 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.219 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.220 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.220 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.220 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.220 187216 DEBUG nova.virt.hardware [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.221 187216 DEBUG nova.objects.instance [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.270 187216 WARNING neutronclient.v2_0.client [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.381 187216 DEBUG nova.network.neutron [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Updating instance_info_cache with network_info: [{"id": "1d533527-0957-4c0b-893c-65f597515760", "address": "fa:16:3e:53:b0:8d", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d533527-09", "ovs_interfaceid": "1d533527-0957-4c0b-893c-65f597515760", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.727 187216 DEBUG nova.objects.base [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<6b3e3c17-d75c-4789-b83d-55f74f5d041f> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.733 187216 DEBUG oslo_concurrency.processutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:14:59 compute-0 podman[197585]: time="2025-11-25T19:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:14:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:14:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.799 187216 DEBUG oslo_concurrency.processutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.800 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "/var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.801 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "/var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.802 187216 DEBUG oslo_concurrency.lockutils [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "/var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.803 187216 DEBUG nova.virt.libvirt.vif [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:13:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-515888401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-515888401',id=8,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:14:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-1y0enyor',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:14:50Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=6b3e3c17-d75c-4789-b83d-55f74f5d041f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:98:25:60"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.804 187216 DEBUG nova.network.os_vif_util [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:98:25:60"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.805 187216 DEBUG nova.network.os_vif_util [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.807 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <uuid>6b3e3c17-d75c-4789-b83d-55f74f5d041f</uuid>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <name>instance-00000008</name>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-515888401</nova:name>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:14:59</nova:creationTime>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_input_bus">usb</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_machine_type">q35</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_video_model">virtio</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:property name="hw_vif_model">virtio</nova:property>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:user uuid="7c561073d7c34a029574a6e2fb952944">tempest-TestExecuteActionsViaActuator-1103022868-project-admin</nova:user>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:project uuid="780511b4bf4d49299cc4d9b324261841">tempest-TestExecuteActionsViaActuator-1103022868</nova:project>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         <nova:port uuid="25d5d7cf-8ec0-4437-b51d-e3239b2b74cf">
Nov 25 19:14:59 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <system>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <entry name="serial">6b3e3c17-d75c-4789-b83d-55f74f5d041f</entry>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <entry name="uuid">6b3e3c17-d75c-4789-b83d-55f74f5d041f</entry>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </system>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <os>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </os>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <features>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </features>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk.config"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:98:25:60"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <target dev="tap25d5d7cf-8e"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/console.log" append="off"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <video>
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </video>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:14:59 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:14:59 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:14:59 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:14:59 compute-0 nova_compute[187212]: </domain>
Nov 25 19:14:59 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.808 187216 DEBUG nova.virt.libvirt.vif [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:13:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-515888401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-515888401',id=8,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:14:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-1y0enyor',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:14:50Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=6b3e3c17-d75c-4789-b83d-55f74f5d041f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:98:25:60"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.808 187216 DEBUG nova.network.os_vif_util [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "vif_mac": "fa:16:3e:98:25:60"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.808 187216 DEBUG nova.network.os_vif_util [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.808 187216 DEBUG os_vif [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.809 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.809 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.810 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.811 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.811 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '37eff0dd-e91a-5385-88b2-7b0b055cf58f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.813 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.818 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.818 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25d5d7cf-8e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.819 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap25d5d7cf-8e, col_values=(('qos', UUID('161a2caa-2ad5-45cd-b502-11fe4fd78b61')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.819 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap25d5d7cf-8e, col_values=(('external_ids', {'iface-id': '25d5d7cf-8ec0-4437-b51d-e3239b2b74cf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:25:60', 'vm-uuid': '6b3e3c17-d75c-4789-b83d-55f74f5d041f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.821 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.823 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:14:59 compute-0 NetworkManager[55552]: <info>  [1764098099.8391] manager: (tap25d5d7cf-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.847 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.848 187216 INFO os_vif [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e')
Nov 25 19:14:59 compute-0 nova_compute[187212]: 2025-11-25 19:14:59.890 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:15:00 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 25 19:15:00 compute-0 systemd[211894]: Activating special unit Exit the Session...
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped target Main User Target.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped target Basic System.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped target Paths.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped target Sockets.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped target Timers.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 25 19:15:00 compute-0 systemd[211894]: Closed D-Bus User Message Bus Socket.
Nov 25 19:15:00 compute-0 systemd[211894]: Stopped Create User's Volatile Files and Directories.
Nov 25 19:15:00 compute-0 systemd[211894]: Removed slice User Application Slice.
Nov 25 19:15:00 compute-0 systemd[211894]: Reached target Shutdown.
Nov 25 19:15:00 compute-0 systemd[211894]: Finished Exit the Session.
Nov 25 19:15:00 compute-0 systemd[211894]: Reached target Exit the Session.
Nov 25 19:15:00 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 25 19:15:00 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 25 19:15:00 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 25 19:15:00 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 25 19:15:00 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 25 19:15:00 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 25 19:15:00 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.123 187216 DEBUG nova.network.neutron [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updated VIF entry in instance network info cache for port 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.124 187216 DEBUG nova.network.neutron [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updating instance_info_cache with network_info: [{"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.419 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.420 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.420 187216 DEBUG oslo_concurrency.lockutils [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.426 187216 INFO nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 25 19:15:00 compute-0 virtqemud[186888]: Domain id=6 name='instance-00000006' uuid=26b4dae6-9c02-403b-b6cb-faf8fa8bb35a is tainted: custom-monitor
Nov 25 19:15:00 compute-0 nova_compute[187212]: 2025-11-25 19:15:00.638 187216 DEBUG oslo_concurrency.lockutils [req-0b134cd5-f2b5-4351-82d9-19388353127a req-bd671046-507e-40df-9758-0e59bad8c97f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-6b3e3c17-d75c-4789-b83d-55f74f5d041f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:15:00 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:00.781 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:01 compute-0 podman[212056]: 2025-11-25 19:15:01.242527261 +0000 UTC m=+0.145396310 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.401 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.402 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.402 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No VIF found with MAC fa:16:3e:98:25:60, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.402 187216 INFO nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Using config drive
Nov 25 19:15:01 compute-0 openstack_network_exporter[199731]: ERROR   19:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:15:01 compute-0 openstack_network_exporter[199731]: ERROR   19:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:15:01 compute-0 openstack_network_exporter[199731]: ERROR   19:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:15:01 compute-0 openstack_network_exporter[199731]: ERROR   19:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:15:01 compute-0 openstack_network_exporter[199731]: ERROR   19:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.436 187216 INFO nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 25 19:15:01 compute-0 kernel: tap25d5d7cf-8e: entered promiscuous mode
Nov 25 19:15:01 compute-0 NetworkManager[55552]: <info>  [1764098101.5043] manager: (tap25d5d7cf-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Nov 25 19:15:01 compute-0 ovn_controller[95465]: 2025-11-25T19:15:01Z|00072|binding|INFO|Claiming lport 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for this chassis.
Nov 25 19:15:01 compute-0 ovn_controller[95465]: 2025-11-25T19:15:01Z|00073|binding|INFO|25d5d7cf-8ec0-4437-b51d-e3239b2b74cf: Claiming fa:16:3e:98:25:60 10.100.0.7
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.506 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.516 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:25:60 10.100.0.7'], port_security=['fa:16:3e:98:25:60 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6b3e3c17-d75c-4789-b83d-55f74f5d041f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.517 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 bound to our chassis
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.518 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:15:01 compute-0 ovn_controller[95465]: 2025-11-25T19:15:01Z|00074|binding|INFO|Setting lport 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf ovn-installed in OVS
Nov 25 19:15:01 compute-0 ovn_controller[95465]: 2025-11-25T19:15:01Z|00075|binding|INFO|Setting lport 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf up in Southbound
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.535 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.541 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.544 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc76634-be9a-4ea0-b1ca-f953b77fed23]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:01 compute-0 systemd-udevd[212096]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:15:01 compute-0 NetworkManager[55552]: <info>  [1764098101.5763] device (tap25d5d7cf-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:15:01 compute-0 NetworkManager[55552]: <info>  [1764098101.5780] device (tap25d5d7cf-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:15:01 compute-0 systemd-machined[153494]: New machine qemu-7-instance-00000008.
Nov 25 19:15:01 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000008.
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.595 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[64e76004-9bd8-4170-ad54-83cb7db2bcfd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.601 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a4775b-8fef-4715-bcff-eeaad41a7542]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.651 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[7c5da069-a105-47c3-a73b-8551c4cc1084]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.672 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[d8917872-5e0e-497f-bb66-d2fc5f02b68d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 27, 'tx_packets': 13, 'rx_bytes': 1414, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 27, 'tx_packets': 13, 'rx_bytes': 1414, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212108, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.690 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[278a8445-21dd-4ff7-b99f-9fc010c827db]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212111, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212111, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.692 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.694 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:01 compute-0 nova_compute[187212]: 2025-11-25 19:15:01.696 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.696 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.697 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.697 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.698 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:01 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:01.699 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8d1d03-beb7-4936-ad8a-b2ae3d7f887e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.006 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.171 187216 DEBUG nova.compute.manager [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.176 187216 INFO nova.virt.libvirt.driver [-] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Instance running successfully.
Nov 25 19:15:02 compute-0 virtqemud[186888]: argument unsupported: QEMU guest agent is not configured
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.178 187216 DEBUG nova.virt.libvirt.guest [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.179 187216 DEBUG nova.virt.libvirt.driver [None req-c36bb73a-ca2f-4fe7-ab3d-550d50bdc9d0 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.248 187216 DEBUG nova.compute.manager [req-f4791b3f-fe02-4f22-a14f-de45a2774e69 req-fce5908f-a2cf-4226-8fdb-d351a3e6e176 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-plugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.248 187216 DEBUG oslo_concurrency.lockutils [req-f4791b3f-fe02-4f22-a14f-de45a2774e69 req-fce5908f-a2cf-4226-8fdb-d351a3e6e176 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.248 187216 DEBUG oslo_concurrency.lockutils [req-f4791b3f-fe02-4f22-a14f-de45a2774e69 req-fce5908f-a2cf-4226-8fdb-d351a3e6e176 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.249 187216 DEBUG oslo_concurrency.lockutils [req-f4791b3f-fe02-4f22-a14f-de45a2774e69 req-fce5908f-a2cf-4226-8fdb-d351a3e6e176 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.249 187216 DEBUG nova.compute.manager [req-f4791b3f-fe02-4f22-a14f-de45a2774e69 req-fce5908f-a2cf-4226-8fdb-d351a3e6e176 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] No waiting events found dispatching network-vif-plugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.249 187216 WARNING nova.compute.manager [req-f4791b3f-fe02-4f22-a14f-de45a2774e69 req-fce5908f-a2cf-4226-8fdb-d351a3e6e176 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received unexpected event network-vif-plugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for instance with vm_state active and task_state resize_finish.
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.447 187216 INFO nova.virt.libvirt.driver [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.454 187216 DEBUG nova.compute.manager [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:15:02 compute-0 nova_compute[187212]: 2025-11-25 19:15:02.967 187216 DEBUG nova.objects.instance [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.001 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.125 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.126 187216 WARNING neutronclient.v2_0.client [None req-8c518324-1f42-4236-9a64-20676028e161 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:04 compute-0 podman[212121]: 2025-11-25 19:15:04.180802744 +0000 UTC m=+0.095246528 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.359 187216 DEBUG nova.compute.manager [req-8f021a1a-8999-48d3-bf48-797a71fa90fb req-c006d0fe-ad64-4a74-8d3a-6649eec0c9f4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-plugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.359 187216 DEBUG oslo_concurrency.lockutils [req-8f021a1a-8999-48d3-bf48-797a71fa90fb req-c006d0fe-ad64-4a74-8d3a-6649eec0c9f4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.360 187216 DEBUG oslo_concurrency.lockutils [req-8f021a1a-8999-48d3-bf48-797a71fa90fb req-c006d0fe-ad64-4a74-8d3a-6649eec0c9f4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.360 187216 DEBUG oslo_concurrency.lockutils [req-8f021a1a-8999-48d3-bf48-797a71fa90fb req-c006d0fe-ad64-4a74-8d3a-6649eec0c9f4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.361 187216 DEBUG nova.compute.manager [req-8f021a1a-8999-48d3-bf48-797a71fa90fb req-c006d0fe-ad64-4a74-8d3a-6649eec0c9f4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] No waiting events found dispatching network-vif-plugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.361 187216 WARNING nova.compute.manager [req-8f021a1a-8999-48d3-bf48-797a71fa90fb req-c006d0fe-ad64-4a74-8d3a-6649eec0c9f4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received unexpected event network-vif-plugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for instance with vm_state resized and task_state None.
Nov 25 19:15:04 compute-0 nova_compute[187212]: 2025-11-25 19:15:04.821 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:07 compute-0 nova_compute[187212]: 2025-11-25 19:15:07.010 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:09 compute-0 podman[212139]: 2025-11-25 19:15:09.190751177 +0000 UTC m=+0.102849266 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 19:15:09 compute-0 nova_compute[187212]: 2025-11-25 19:15:09.825 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:12 compute-0 nova_compute[187212]: 2025-11-25 19:15:12.012 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:13 compute-0 podman[212159]: 2025-11-25 19:15:13.186265931 +0000 UTC m=+0.102876467 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Nov 25 19:15:14 compute-0 nova_compute[187212]: 2025-11-25 19:15:14.682 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:14 compute-0 nova_compute[187212]: 2025-11-25 19:15:14.683 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:15:14 compute-0 nova_compute[187212]: 2025-11-25 19:15:14.868 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:17 compute-0 nova_compute[187212]: 2025-11-25 19:15:17.086 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:17 compute-0 ovn_controller[95465]: 2025-11-25T19:15:17Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:25:60 10.100.0.7
Nov 25 19:15:19 compute-0 nova_compute[187212]: 2025-11-25 19:15:19.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:19 compute-0 nova_compute[187212]: 2025-11-25 19:15:19.872 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:20 compute-0 nova_compute[187212]: 2025-11-25 19:15:20.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:20 compute-0 nova_compute[187212]: 2025-11-25 19:15:20.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:20 compute-0 nova_compute[187212]: 2025-11-25 19:15:20.799 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:20 compute-0 nova_compute[187212]: 2025-11-25 19:15:20.800 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:20 compute-0 nova_compute[187212]: 2025-11-25 19:15:20.800 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:20 compute-0 nova_compute[187212]: 2025-11-25 19:15:20.800 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.814 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.814 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.814 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.815 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.815 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.914 187216 INFO nova.compute.manager [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Terminating instance
Nov 25 19:15:21 compute-0 nova_compute[187212]: 2025-11-25 19:15:21.958 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.089 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.094 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.095 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.172 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.179 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.245 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.246 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.337 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.345 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.427 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.429 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.519 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.526 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.573 187216 DEBUG nova.compute.manager [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.591 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.592 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 kernel: tap03b3db32-87 (unregistering): left promiscuous mode
Nov 25 19:15:22 compute-0 NetworkManager[55552]: <info>  [1764098122.5998] device (tap03b3db32-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:15:22 compute-0 ovn_controller[95465]: 2025-11-25T19:15:22Z|00076|binding|INFO|Releasing lport 03b3db32-8760-4f8f-8c29-8fe9aba447fe from this chassis (sb_readonly=0)
Nov 25 19:15:22 compute-0 ovn_controller[95465]: 2025-11-25T19:15:22Z|00077|binding|INFO|Setting lport 03b3db32-8760-4f8f-8c29-8fe9aba447fe down in Southbound
Nov 25 19:15:22 compute-0 ovn_controller[95465]: 2025-11-25T19:15:22Z|00078|binding|INFO|Removing iface tap03b3db32-87 ovn-installed in OVS
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.623 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.638 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.664 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 25 19:15:22 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 16.277s CPU time.
Nov 25 19:15:22 compute-0 systemd-machined[153494]: Machine qemu-5-instance-00000009 terminated.
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.671 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.730 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.731 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.797 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.849 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.859 187216 INFO nova.virt.libvirt.driver [-] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Instance destroyed successfully.
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.861 187216 DEBUG nova.objects.instance [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'resources' on Instance uuid 02775601-3840-4250-809d-622ab3cf2e99 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.917 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.918 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:22 compute-0 nova_compute[187212]: 2025-11-25 19:15:22.990 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.011 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:73:1f 10.100.0.8'], port_security=['fa:16:3e:7f:73:1f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '02775601-3840-4250-809d-622ab3cf2e99', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=03b3db32-8760-4f8f-8c29-8fe9aba447fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.012 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 03b3db32-8760-4f8f-8c29-8fe9aba447fe in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.014 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.034 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa5f589-ebe3-4c4d-95af-68b75352df4a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.074 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf6dc10-f777-4fd6-af85-34ff58fac084]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.077 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[57d7550b-dd68-4011-8659-5549cb7e0ccf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.114 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[126773d8-bce7-4fb3-aa6d-214e476f8415]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.141 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fdd00c-c6d5-452e-b0c3-1e0c6d203fcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 35, 'tx_packets': 15, 'rx_bytes': 1750, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 35, 'tx_packets': 15, 'rx_bytes': 1750, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212255, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.167 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c8addbb6-c807-41bb-99e7-7f74ab0632d8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212256, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212256, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.169 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.173 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.178 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.179 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.179 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.180 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.181 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:23 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:23.182 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[04f9790d-372b-4f94-84cb-f1a268bccb0d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.231 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.232 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.258 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.259 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4947MB free_disk=72.8242301940918GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.259 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.259 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.891 187216 DEBUG nova.virt.libvirt.vif [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-740998153',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-740998153',id=9,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:14:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-c525g54q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:14:20Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=02775601-3840-4250-809d-622ab3cf2e99,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.892 187216 DEBUG nova.network.os_vif_util [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "address": "fa:16:3e:7f:73:1f", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03b3db32-87", "ovs_interfaceid": "03b3db32-8760-4f8f-8c29-8fe9aba447fe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.893 187216 DEBUG nova.network.os_vif_util [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.893 187216 DEBUG os_vif [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.895 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.896 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03b3db32-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.938 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.941 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.942 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.942 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=90a7514d-f3a8-4faa-a7b9-832e7c0e8bdc) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.943 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.944 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.948 187216 INFO os_vif [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:73:1f,bridge_name='br-int',has_traffic_filtering=True,id=03b3db32-8760-4f8f-8c29-8fe9aba447fe,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03b3db32-87')
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.948 187216 INFO nova.virt.libvirt.driver [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Deleting instance files /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99_del
Nov 25 19:15:23 compute-0 nova_compute[187212]: 2025-11-25 19:15:23.950 187216 INFO nova.virt.libvirt.driver [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Deletion of /var/lib/nova/instances/02775601-3840-4250-809d-622ab3cf2e99_del complete
Nov 25 19:15:24 compute-0 nova_compute[187212]: 2025-11-25 19:15:24.677 187216 DEBUG nova.compute.manager [req-cadff22a-8862-428c-a943-49baacf83cce req-b1109eb3-c14f-4d21-920c-b240ecb2fd2b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-unplugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:24 compute-0 nova_compute[187212]: 2025-11-25 19:15:24.678 187216 DEBUG oslo_concurrency.lockutils [req-cadff22a-8862-428c-a943-49baacf83cce req-b1109eb3-c14f-4d21-920c-b240ecb2fd2b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:24 compute-0 nova_compute[187212]: 2025-11-25 19:15:24.678 187216 DEBUG oslo_concurrency.lockutils [req-cadff22a-8862-428c-a943-49baacf83cce req-b1109eb3-c14f-4d21-920c-b240ecb2fd2b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:24 compute-0 nova_compute[187212]: 2025-11-25 19:15:24.679 187216 DEBUG oslo_concurrency.lockutils [req-cadff22a-8862-428c-a943-49baacf83cce req-b1109eb3-c14f-4d21-920c-b240ecb2fd2b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:24 compute-0 nova_compute[187212]: 2025-11-25 19:15:24.679 187216 DEBUG nova.compute.manager [req-cadff22a-8862-428c-a943-49baacf83cce req-b1109eb3-c14f-4d21-920c-b240ecb2fd2b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] No waiting events found dispatching network-vif-unplugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:24 compute-0 nova_compute[187212]: 2025-11-25 19:15:24.680 187216 DEBUG nova.compute.manager [req-cadff22a-8862-428c-a943-49baacf83cce req-b1109eb3-c14f-4d21-920c-b240ecb2fd2b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-unplugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.066 187216 INFO nova.compute.manager [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Took 2.49 seconds to destroy the instance on the hypervisor.
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.067 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.067 187216 DEBUG nova.compute.manager [-] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.068 187216 DEBUG nova.network.neutron [-] [instance: 02775601-3840-4250-809d-622ab3cf2e99] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.068 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.182 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:25 compute-0 podman[212259]: 2025-11-25 19:15:25.207949961 +0000 UTC m=+0.116069297 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.302 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 909b423a-9e57-4bb8-b6b5-719b05724d71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.303 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance dd2a5303-3518-4f79-aa7b-45fc96059d01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.303 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 6fe2a300-76bb-44b4-8828-f87977451114 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.304 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 02775601-3840-4250-809d-622ab3cf2e99 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.304 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.305 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 6b3e3c17-d75c-4789-b83d-55f74f5d041f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.305 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 6 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.305 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1344MB phys_disk=79GB used_disk=6GB total_vcpus=8 used_vcpus=6 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:15:23 up  1:07,  0 user,  load average: 1.23, 0.69, 0.56\n', 'num_instances': '6', 'num_vm_active': '6', 'num_task_None': '5', 'num_os_type_None': '6', 'num_proj_780511b4bf4d49299cc4d9b324261841': '6', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.351 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.418 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.419 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.451 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.551 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:15:25 compute-0 nova_compute[187212]: 2025-11-25 19:15:25.932 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.208 187216 DEBUG nova.compute.manager [req-192255de-8ed1-499f-8798-59b62aef1897 req-5123c7ce-69b7-4508-a6fe-9e90baa072b6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-deleted-03b3db32-8760-4f8f-8c29-8fe9aba447fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.209 187216 INFO nova.compute.manager [req-192255de-8ed1-499f-8798-59b62aef1897 req-5123c7ce-69b7-4508-a6fe-9e90baa072b6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Neutron deleted interface 03b3db32-8760-4f8f-8c29-8fe9aba447fe; detaching it from the instance and deleting it from the info cache
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.209 187216 DEBUG nova.network.neutron [req-192255de-8ed1-499f-8798-59b62aef1897 req-5123c7ce-69b7-4508-a6fe-9e90baa072b6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.443 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.626 187216 DEBUG nova.network.neutron [-] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.722 187216 DEBUG nova.compute.manager [req-192255de-8ed1-499f-8798-59b62aef1897 req-5123c7ce-69b7-4508-a6fe-9e90baa072b6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Detach interface failed, port_id=03b3db32-8760-4f8f-8c29-8fe9aba447fe, reason: Instance 02775601-3840-4250-809d-622ab3cf2e99 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.755 187216 DEBUG nova.compute.manager [req-fa6e6dd1-3469-44c4-a4d5-6c5d47e9067b req-175384be-9277-479e-9dce-8ddae79251a6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-unplugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.756 187216 DEBUG oslo_concurrency.lockutils [req-fa6e6dd1-3469-44c4-a4d5-6c5d47e9067b req-175384be-9277-479e-9dce-8ddae79251a6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "02775601-3840-4250-809d-622ab3cf2e99-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.757 187216 DEBUG oslo_concurrency.lockutils [req-fa6e6dd1-3469-44c4-a4d5-6c5d47e9067b req-175384be-9277-479e-9dce-8ddae79251a6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.757 187216 DEBUG oslo_concurrency.lockutils [req-fa6e6dd1-3469-44c4-a4d5-6c5d47e9067b req-175384be-9277-479e-9dce-8ddae79251a6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.758 187216 DEBUG nova.compute.manager [req-fa6e6dd1-3469-44c4-a4d5-6c5d47e9067b req-175384be-9277-479e-9dce-8ddae79251a6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] No waiting events found dispatching network-vif-unplugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.758 187216 DEBUG nova.compute.manager [req-fa6e6dd1-3469-44c4-a4d5-6c5d47e9067b req-175384be-9277-479e-9dce-8ddae79251a6 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Received event network-vif-unplugged-03b3db32-8760-4f8f-8c29-8fe9aba447fe for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.957 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:15:26 compute-0 nova_compute[187212]: 2025-11-25 19:15:26.958 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.699s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:27 compute-0 nova_compute[187212]: 2025-11-25 19:15:27.093 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:27 compute-0 nova_compute[187212]: 2025-11-25 19:15:27.133 187216 INFO nova.compute.manager [-] [instance: 02775601-3840-4250-809d-622ab3cf2e99] Took 2.07 seconds to deallocate network for instance.
Nov 25 19:15:27 compute-0 nova_compute[187212]: 2025-11-25 19:15:27.656 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:27 compute-0 nova_compute[187212]: 2025-11-25 19:15:27.656 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:27 compute-0 nova_compute[187212]: 2025-11-25 19:15:27.790 187216 DEBUG nova.compute.provider_tree [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:15:28 compute-0 nova_compute[187212]: 2025-11-25 19:15:28.300 187216 DEBUG nova.scheduler.client.report [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:15:28 compute-0 nova_compute[187212]: 2025-11-25 19:15:28.811 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.154s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:28 compute-0 nova_compute[187212]: 2025-11-25 19:15:28.841 187216 INFO nova.scheduler.client.report [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Deleted allocations for instance 02775601-3840-4250-809d-622ab3cf2e99
Nov 25 19:15:28 compute-0 nova_compute[187212]: 2025-11-25 19:15:28.945 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:28 compute-0 nova_compute[187212]: 2025-11-25 19:15:28.954 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:28 compute-0 nova_compute[187212]: 2025-11-25 19:15:28.954 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:29 compute-0 nova_compute[187212]: 2025-11-25 19:15:29.465 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:29 compute-0 nova_compute[187212]: 2025-11-25 19:15:29.465 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:29 compute-0 nova_compute[187212]: 2025-11-25 19:15:29.466 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:15:29 compute-0 podman[197585]: time="2025-11-25T19:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:15:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:15:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3069 "" "Go-http-client/1.1"
Nov 25 19:15:29 compute-0 nova_compute[187212]: 2025-11-25 19:15:29.878 187216 DEBUG oslo_concurrency.lockutils [None req-40d63d74-b2c4-4ebc-9007-07388b81ee26 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "02775601-3840-4250-809d-622ab3cf2e99" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.064s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:30 compute-0 nova_compute[187212]: 2025-11-25 19:15:30.645 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:30 compute-0 nova_compute[187212]: 2025-11-25 19:15:30.646 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:30 compute-0 nova_compute[187212]: 2025-11-25 19:15:30.646 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:30 compute-0 nova_compute[187212]: 2025-11-25 19:15:30.646 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:30 compute-0 nova_compute[187212]: 2025-11-25 19:15:30.647 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:30 compute-0 nova_compute[187212]: 2025-11-25 19:15:30.664 187216 INFO nova.compute.manager [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Terminating instance
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.085 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.085 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.086 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.186 187216 DEBUG nova.compute.manager [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:15:31 compute-0 kernel: tap25d5d7cf-8e (unregistering): left promiscuous mode
Nov 25 19:15:31 compute-0 NetworkManager[55552]: <info>  [1764098131.2114] device (tap25d5d7cf-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:15:31 compute-0 ovn_controller[95465]: 2025-11-25T19:15:31Z|00079|binding|INFO|Releasing lport 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf from this chassis (sb_readonly=0)
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.220 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 ovn_controller[95465]: 2025-11-25T19:15:31Z|00080|binding|INFO|Setting lport 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf down in Southbound
Nov 25 19:15:31 compute-0 ovn_controller[95465]: 2025-11-25T19:15:31Z|00081|binding|INFO|Removing iface tap25d5d7cf-8e ovn-installed in OVS
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.223 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.234 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:25:60 10.100.0.7'], port_security=['fa:16:3e:98:25:60 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6b3e3c17-d75c-4789-b83d-55f74f5d041f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.235 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.236 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.238 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.262 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f16685b8-a1e6-41d6-8cc5-58a53dec76f3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 25 19:15:31 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Consumed 15.320s CPU time.
Nov 25 19:15:31 compute-0 systemd-machined[153494]: Machine qemu-7-instance-00000008 terminated.
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.313 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[a68717a1-bf37-44fa-946b-e50c116e8458]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.339 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[f1760ffa-c707-47db-b08c-0e95cdb72450]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.386 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[54f27ef2-1192-40c0-9c46-d80b256c6d9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.408 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a212712e-5d2a-4521-9091-33791f110948]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 17, 'rx_bytes': 1792, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212314, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: ERROR   19:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: ERROR   19:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: ERROR   19:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: ERROR   19:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: ERROR   19:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:15:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.437 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe75ca8-ef27-48e9-bb47-1ae325a6f702]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212324, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212324, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.439 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.441 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.446 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.446 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.447 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.447 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.447 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:31.449 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[83e78c01-6781-4a89-9fcd-8c5162eff004]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.461 187216 INFO nova.virt.libvirt.driver [-] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Instance destroyed successfully.
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.461 187216 DEBUG nova.objects.instance [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'resources' on Instance uuid 6b3e3c17-d75c-4789-b83d-55f74f5d041f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:15:31 compute-0 podman[212287]: 2025-11-25 19:15:31.465501934 +0000 UTC m=+0.177586930 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.490 187216 DEBUG nova.compute.manager [req-9bb78345-8786-43a2-bcb8-f8bb1d65fd7d req-12df481e-5fa6-4a40-87ea-540bdf093a59 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.490 187216 DEBUG oslo_concurrency.lockutils [req-9bb78345-8786-43a2-bcb8-f8bb1d65fd7d req-12df481e-5fa6-4a40-87ea-540bdf093a59 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.490 187216 DEBUG oslo_concurrency.lockutils [req-9bb78345-8786-43a2-bcb8-f8bb1d65fd7d req-12df481e-5fa6-4a40-87ea-540bdf093a59 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.490 187216 DEBUG oslo_concurrency.lockutils [req-9bb78345-8786-43a2-bcb8-f8bb1d65fd7d req-12df481e-5fa6-4a40-87ea-540bdf093a59 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.491 187216 DEBUG nova.compute.manager [req-9bb78345-8786-43a2-bcb8-f8bb1d65fd7d req-12df481e-5fa6-4a40-87ea-540bdf093a59 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] No waiting events found dispatching network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.491 187216 DEBUG nova.compute.manager [req-9bb78345-8786-43a2-bcb8-f8bb1d65fd7d req-12df481e-5fa6-4a40-87ea-540bdf093a59 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.970 187216 DEBUG nova.virt.libvirt.vif [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:13:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-515888401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-515888401',id=8,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:15:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-1y0enyor',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:15:15Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=6b3e3c17-d75c-4789-b83d-55f74f5d041f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.970 187216 DEBUG nova.network.os_vif_util [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "address": "fa:16:3e:98:25:60", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25d5d7cf-8e", "ovs_interfaceid": "25d5d7cf-8ec0-4437-b51d-e3239b2b74cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.971 187216 DEBUG nova.network.os_vif_util [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.971 187216 DEBUG os_vif [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.972 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.973 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25d5d7cf-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.994 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.996 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.997 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.998 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=161a2caa-2ad5-45cd-b502-11fe4fd78b61) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:31 compute-0 nova_compute[187212]: 2025-11-25 19:15:31.998 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.000 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.002 187216 INFO os_vif [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:25:60,bridge_name='br-int',has_traffic_filtering=True,id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25d5d7cf-8e')
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.003 187216 INFO nova.virt.libvirt.driver [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Deleting instance files /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f_del
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.006 187216 INFO nova.virt.libvirt.driver [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Deletion of /var/lib/nova/instances/6b3e3c17-d75c-4789-b83d-55f74f5d041f_del complete
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.094 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.518 187216 INFO nova.compute.manager [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Took 1.33 seconds to destroy the instance on the hypervisor.
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.518 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.519 187216 DEBUG nova.compute.manager [-] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.519 187216 DEBUG nova.network.neutron [-] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:15:32 compute-0 nova_compute[187212]: 2025-11-25 19:15:32.520 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.218 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.548 187216 DEBUG nova.compute.manager [req-7803214f-e85e-42c0-a378-67c314a3a223 req-6113f742-5904-42e7-a84b-db99a1156109 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-deleted-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.549 187216 INFO nova.compute.manager [req-7803214f-e85e-42c0-a378-67c314a3a223 req-6113f742-5904-42e7-a84b-db99a1156109 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Neutron deleted interface 25d5d7cf-8ec0-4437-b51d-e3239b2b74cf; detaching it from the instance and deleting it from the info cache
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.549 187216 DEBUG nova.network.neutron [req-7803214f-e85e-42c0-a378-67c314a3a223 req-6113f742-5904-42e7-a84b-db99a1156109 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.629 187216 DEBUG nova.compute.manager [req-da7ab01b-18ba-45a9-8a9d-ce8661416529 req-eb81630a-0786-4b25-b1c5-abc010efc508 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.630 187216 DEBUG oslo_concurrency.lockutils [req-da7ab01b-18ba-45a9-8a9d-ce8661416529 req-eb81630a-0786-4b25-b1c5-abc010efc508 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.630 187216 DEBUG oslo_concurrency.lockutils [req-da7ab01b-18ba-45a9-8a9d-ce8661416529 req-eb81630a-0786-4b25-b1c5-abc010efc508 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.630 187216 DEBUG oslo_concurrency.lockutils [req-da7ab01b-18ba-45a9-8a9d-ce8661416529 req-eb81630a-0786-4b25-b1c5-abc010efc508 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.631 187216 DEBUG nova.compute.manager [req-da7ab01b-18ba-45a9-8a9d-ce8661416529 req-eb81630a-0786-4b25-b1c5-abc010efc508 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] No waiting events found dispatching network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.631 187216 DEBUG nova.compute.manager [req-da7ab01b-18ba-45a9-8a9d-ce8661416529 req-eb81630a-0786-4b25-b1c5-abc010efc508 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Received event network-vif-unplugged-25d5d7cf-8ec0-4437-b51d-e3239b2b74cf for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:33 compute-0 nova_compute[187212]: 2025-11-25 19:15:33.993 187216 DEBUG nova.network.neutron [-] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:34 compute-0 nova_compute[187212]: 2025-11-25 19:15:34.061 187216 DEBUG nova.compute.manager [req-7803214f-e85e-42c0-a378-67c314a3a223 req-6113f742-5904-42e7-a84b-db99a1156109 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Detach interface failed, port_id=25d5d7cf-8ec0-4437-b51d-e3239b2b74cf, reason: Instance 6b3e3c17-d75c-4789-b83d-55f74f5d041f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:15:34 compute-0 nova_compute[187212]: 2025-11-25 19:15:34.501 187216 INFO nova.compute.manager [-] [instance: 6b3e3c17-d75c-4789-b83d-55f74f5d041f] Took 1.98 seconds to deallocate network for instance.
Nov 25 19:15:35 compute-0 nova_compute[187212]: 2025-11-25 19:15:35.023 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:35 compute-0 nova_compute[187212]: 2025-11-25 19:15:35.024 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:35 compute-0 nova_compute[187212]: 2025-11-25 19:15:35.138 187216 DEBUG nova.compute.provider_tree [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:15:35 compute-0 podman[212340]: 2025-11-25 19:15:35.168810173 +0000 UTC m=+0.071856358 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Nov 25 19:15:35 compute-0 nova_compute[187212]: 2025-11-25 19:15:35.647 187216 DEBUG nova.scheduler.client.report [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:15:36 compute-0 nova_compute[187212]: 2025-11-25 19:15:36.159 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:36 compute-0 nova_compute[187212]: 2025-11-25 19:15:36.237 187216 INFO nova.scheduler.client.report [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Deleted allocations for instance 6b3e3c17-d75c-4789-b83d-55f74f5d041f
Nov 25 19:15:37 compute-0 nova_compute[187212]: 2025-11-25 19:15:37.000 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:37 compute-0 nova_compute[187212]: 2025-11-25 19:15:37.097 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:37 compute-0 nova_compute[187212]: 2025-11-25 19:15:37.404 187216 DEBUG oslo_concurrency.lockutils [None req-9d11e4db-65ed-4da8-aa43-493d82209b65 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6b3e3c17-d75c-4789-b83d-55f74f5d041f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.758s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:39 compute-0 nova_compute[187212]: 2025-11-25 19:15:39.946 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:39 compute-0 nova_compute[187212]: 2025-11-25 19:15:39.946 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:39 compute-0 nova_compute[187212]: 2025-11-25 19:15:39.947 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:39 compute-0 nova_compute[187212]: 2025-11-25 19:15:39.947 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:39 compute-0 nova_compute[187212]: 2025-11-25 19:15:39.948 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:39 compute-0 nova_compute[187212]: 2025-11-25 19:15:39.963 187216 INFO nova.compute.manager [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Terminating instance
Nov 25 19:15:40 compute-0 podman[212360]: 2025-11-25 19:15:40.183993881 +0000 UTC m=+0.095494989 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.483 187216 DEBUG nova.compute.manager [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:15:40 compute-0 kernel: tap5cceedef-39 (unregistering): left promiscuous mode
Nov 25 19:15:40 compute-0 NetworkManager[55552]: <info>  [1764098140.5068] device (tap5cceedef-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.511 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:40 compute-0 ovn_controller[95465]: 2025-11-25T19:15:40Z|00082|binding|INFO|Releasing lport 5cceedef-39bc-43df-be34-b65a3f0dd6b1 from this chassis (sb_readonly=0)
Nov 25 19:15:40 compute-0 ovn_controller[95465]: 2025-11-25T19:15:40Z|00083|binding|INFO|Setting lport 5cceedef-39bc-43df-be34-b65a3f0dd6b1 down in Southbound
Nov 25 19:15:40 compute-0 ovn_controller[95465]: 2025-11-25T19:15:40Z|00084|binding|INFO|Removing iface tap5cceedef-39 ovn-installed in OVS
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.515 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.544 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:40 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 25 19:15:40 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 20.260s CPU time.
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.560 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:b8:85 10.100.0.13'], port_security=['fa:16:3e:cb:b8:85 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6fe2a300-76bb-44b4-8828-f87977451114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=5cceedef-39bc-43df-be34-b65a3f0dd6b1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.562 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 5cceedef-39bc-43df-be34-b65a3f0dd6b1 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:15:40 compute-0 systemd-machined[153494]: Machine qemu-4-instance-00000007 terminated.
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.564 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.580 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[5559cbde-dc44-45b5-9a56-2f976a486c6f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.620 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[42f56bd5-6c29-478b-853d-52fb228e5894]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.623 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[a3dfe568-77d6-4c52-9f1f-f11ae91f0097]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.652 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[15b6ef80-e159-4680-acd5-a208cee5c144]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.672 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbec706-8781-4b8a-94f6-fd5a3b4cce97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 1792, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212394, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.694 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4571d5f3-0acf-4895-b875-63d161e0a41b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212395, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212395, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.696 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.733 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.745 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.746 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.747 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.747 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.747 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:40 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:40.750 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[061dfa60-137c-4929-b8cb-d9997f9678cc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.778 187216 INFO nova.virt.libvirt.driver [-] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Instance destroyed successfully.
Nov 25 19:15:40 compute-0 nova_compute[187212]: 2025-11-25 19:15:40.779 187216 DEBUG nova.objects.instance [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'resources' on Instance uuid 6fe2a300-76bb-44b4-8828-f87977451114 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.339 187216 DEBUG nova.virt.libvirt.vif [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:13:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1631407178',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1631407178',id=7,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:13:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-8kkhqkbd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:13:36Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=6fe2a300-76bb-44b4-8828-f87977451114,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.339 187216 DEBUG nova.network.os_vif_util [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "address": "fa:16:3e:cb:b8:85", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5cceedef-39", "ovs_interfaceid": "5cceedef-39bc-43df-be34-b65a3f0dd6b1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.340 187216 DEBUG nova.network.os_vif_util [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.341 187216 DEBUG os_vif [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.343 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.343 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5cceedef-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.345 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.347 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.348 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.349 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=43723900-9c38-4267-91ea-0251fc489e32) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.350 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.351 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.356 187216 DEBUG nova.compute.manager [req-92325201-4c61-4e4e-95b7-fa1ef4a349dc req-b5c9e67c-a958-41d0-af23-171e8743215c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-unplugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.356 187216 DEBUG oslo_concurrency.lockutils [req-92325201-4c61-4e4e-95b7-fa1ef4a349dc req-b5c9e67c-a958-41d0-af23-171e8743215c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.357 187216 DEBUG oslo_concurrency.lockutils [req-92325201-4c61-4e4e-95b7-fa1ef4a349dc req-b5c9e67c-a958-41d0-af23-171e8743215c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.358 187216 DEBUG oslo_concurrency.lockutils [req-92325201-4c61-4e4e-95b7-fa1ef4a349dc req-b5c9e67c-a958-41d0-af23-171e8743215c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.358 187216 DEBUG nova.compute.manager [req-92325201-4c61-4e4e-95b7-fa1ef4a349dc req-b5c9e67c-a958-41d0-af23-171e8743215c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] No waiting events found dispatching network-vif-unplugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.358 187216 DEBUG nova.compute.manager [req-92325201-4c61-4e4e-95b7-fa1ef4a349dc req-b5c9e67c-a958-41d0-af23-171e8743215c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-unplugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.360 187216 INFO os_vif [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:b8:85,bridge_name='br-int',has_traffic_filtering=True,id=5cceedef-39bc-43df-be34-b65a3f0dd6b1,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5cceedef-39')
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.360 187216 INFO nova.virt.libvirt.driver [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Deleting instance files /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114_del
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.361 187216 INFO nova.virt.libvirt.driver [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Deletion of /var/lib/nova/instances/6fe2a300-76bb-44b4-8828-f87977451114_del complete
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.879 187216 INFO nova.compute.manager [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Took 1.39 seconds to destroy the instance on the hypervisor.
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.880 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.880 187216 DEBUG nova.compute.manager [-] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.881 187216 DEBUG nova.network.neutron [-] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:15:41 compute-0 nova_compute[187212]: 2025-11-25 19:15:41.881 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:42 compute-0 nova_compute[187212]: 2025-11-25 19:15:42.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:42 compute-0 nova_compute[187212]: 2025-11-25 19:15:42.231 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.107 187216 DEBUG nova.network.neutron [-] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.484 187216 DEBUG nova.compute.manager [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-unplugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.484 187216 DEBUG oslo_concurrency.lockutils [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "6fe2a300-76bb-44b4-8828-f87977451114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.485 187216 DEBUG oslo_concurrency.lockutils [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.485 187216 DEBUG oslo_concurrency.lockutils [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.485 187216 DEBUG nova.compute.manager [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] No waiting events found dispatching network-vif-unplugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.486 187216 DEBUG nova.compute.manager [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-unplugged-5cceedef-39bc-43df-be34-b65a3f0dd6b1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.486 187216 DEBUG nova.compute.manager [req-0177b69a-60ff-4164-8bed-d035b5a7c16e req-8180feef-29bf-463a-8b79-2f4b789719f5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Received event network-vif-deleted-5cceedef-39bc-43df-be34-b65a3f0dd6b1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:43 compute-0 nova_compute[187212]: 2025-11-25 19:15:43.625 187216 INFO nova.compute.manager [-] [instance: 6fe2a300-76bb-44b4-8828-f87977451114] Took 1.74 seconds to deallocate network for instance.
Nov 25 19:15:44 compute-0 nova_compute[187212]: 2025-11-25 19:15:44.146 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:44 compute-0 nova_compute[187212]: 2025-11-25 19:15:44.148 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:44 compute-0 podman[212411]: 2025-11-25 19:15:44.17120569 +0000 UTC m=+0.092118497 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:15:44 compute-0 nova_compute[187212]: 2025-11-25 19:15:44.255 187216 DEBUG nova.compute.provider_tree [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:15:44 compute-0 nova_compute[187212]: 2025-11-25 19:15:44.763 187216 DEBUG nova.scheduler.client.report [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:15:45 compute-0 nova_compute[187212]: 2025-11-25 19:15:45.277 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.129s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:45 compute-0 nova_compute[187212]: 2025-11-25 19:15:45.301 187216 INFO nova.scheduler.client.report [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Deleted allocations for instance 6fe2a300-76bb-44b4-8828-f87977451114
Nov 25 19:15:46 compute-0 nova_compute[187212]: 2025-11-25 19:15:46.333 187216 DEBUG oslo_concurrency.lockutils [None req-b655c94e-7610-4ae6-bd5f-d3b203050e4c 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "6fe2a300-76bb-44b4-8828-f87977451114" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.386s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:46 compute-0 nova_compute[187212]: 2025-11-25 19:15:46.398 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.102 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.720 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.721 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.721 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.722 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.722 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:47 compute-0 nova_compute[187212]: 2025-11-25 19:15:47.738 187216 INFO nova.compute.manager [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Terminating instance
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.254 187216 DEBUG nova.compute.manager [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:15:48 compute-0 kernel: tap1d533527-09 (unregistering): left promiscuous mode
Nov 25 19:15:48 compute-0 NetworkManager[55552]: <info>  [1764098148.2857] device (tap1d533527-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:15:48 compute-0 ovn_controller[95465]: 2025-11-25T19:15:48Z|00085|binding|INFO|Releasing lport 1d533527-0957-4c0b-893c-65f597515760 from this chassis (sb_readonly=0)
Nov 25 19:15:48 compute-0 ovn_controller[95465]: 2025-11-25T19:15:48Z|00086|binding|INFO|Setting lport 1d533527-0957-4c0b-893c-65f597515760 down in Southbound
Nov 25 19:15:48 compute-0 ovn_controller[95465]: 2025-11-25T19:15:48Z|00087|binding|INFO|Removing iface tap1d533527-09 ovn-installed in OVS
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.289 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.292 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.299 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:b0:8d 10.100.0.4'], port_security=['fa:16:3e:53:b0:8d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '26b4dae6-9c02-403b-b6cb-faf8fa8bb35a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=1d533527-0957-4c0b-893c-65f597515760) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.300 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 1d533527-0957-4c0b-893c-65f597515760 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.307 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.317 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.333 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[60d4f589-f2a1-4eef-94a5-0bddf2d956af]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:48 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 25 19:15:48 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 4.143s CPU time.
Nov 25 19:15:48 compute-0 systemd-machined[153494]: Machine qemu-6-instance-00000006 terminated.
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.383 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef5ea7c-9d50-491a-a08c-1a74ebab7822]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.387 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[5d609090-7209-47da-b9cd-2941ebda5143]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.432 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[8e222f78-d5ec-4152-900a-dc2c7e76bc1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.464 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a92d4212-1310-4421-9da9-9fcfda035a3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 1792, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212445, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.472 187216 DEBUG nova.compute.manager [req-ecfe0864-bae4-41f2-914b-15104a3d7133 req-7c2b89e6-1686-49cc-b823-db695aa316c3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Received event network-vif-unplugged-1d533527-0957-4c0b-893c-65f597515760 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.472 187216 DEBUG oslo_concurrency.lockutils [req-ecfe0864-bae4-41f2-914b-15104a3d7133 req-7c2b89e6-1686-49cc-b823-db695aa316c3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.473 187216 DEBUG oslo_concurrency.lockutils [req-ecfe0864-bae4-41f2-914b-15104a3d7133 req-7c2b89e6-1686-49cc-b823-db695aa316c3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.473 187216 DEBUG oslo_concurrency.lockutils [req-ecfe0864-bae4-41f2-914b-15104a3d7133 req-7c2b89e6-1686-49cc-b823-db695aa316c3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.473 187216 DEBUG nova.compute.manager [req-ecfe0864-bae4-41f2-914b-15104a3d7133 req-7c2b89e6-1686-49cc-b823-db695aa316c3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] No waiting events found dispatching network-vif-unplugged-1d533527-0957-4c0b-893c-65f597515760 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.474 187216 DEBUG nova.compute.manager [req-ecfe0864-bae4-41f2-914b-15104a3d7133 req-7c2b89e6-1686-49cc-b823-db695aa316c3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Received event network-vif-unplugged-1d533527-0957-4c0b-893c-65f597515760 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.484 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.491 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.491 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0a922cd1-7828-4386-be18-fdbb7480bcc8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212447, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212447, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.493 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.495 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 nova_compute[187212]: 2025-11-25 19:15:48.500 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.501 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.501 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.502 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.502 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:48.504 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c309386e-9127-4465-8f21-99e579a69f2e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.423 187216 INFO nova.virt.libvirt.driver [-] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Instance destroyed successfully.
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.424 187216 DEBUG nova.objects.instance [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'resources' on Instance uuid 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.937 187216 DEBUG nova.virt.libvirt.vif [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-11-25T19:12:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-754987485',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-754987485',id=6,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:13:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-blialy70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',clean_attempts='1',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:15:03Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=26b4dae6-9c02-403b-b6cb-faf8fa8bb35a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d533527-0957-4c0b-893c-65f597515760", "address": "fa:16:3e:53:b0:8d", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d533527-09", "ovs_interfaceid": "1d533527-0957-4c0b-893c-65f597515760", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.938 187216 DEBUG nova.network.os_vif_util [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "1d533527-0957-4c0b-893c-65f597515760", "address": "fa:16:3e:53:b0:8d", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d533527-09", "ovs_interfaceid": "1d533527-0957-4c0b-893c-65f597515760", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.939 187216 DEBUG nova.network.os_vif_util [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:b0:8d,bridge_name='br-int',has_traffic_filtering=True,id=1d533527-0957-4c0b-893c-65f597515760,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d533527-09') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.939 187216 DEBUG os_vif [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:b0:8d,bridge_name='br-int',has_traffic_filtering=True,id=1d533527-0957-4c0b-893c-65f597515760,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d533527-09') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.942 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.942 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d533527-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.944 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.947 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.948 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.949 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=15e76350-7c5e-41b7-b005-0e0887c59dd4) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.950 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.951 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.954 187216 INFO os_vif [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:b0:8d,bridge_name='br-int',has_traffic_filtering=True,id=1d533527-0957-4c0b-893c-65f597515760,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d533527-09')
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.955 187216 INFO nova.virt.libvirt.driver [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Deleting instance files /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a_del
Nov 25 19:15:49 compute-0 nova_compute[187212]: 2025-11-25 19:15:49.956 187216 INFO nova.virt.libvirt.driver [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Deletion of /var/lib/nova/instances/26b4dae6-9c02-403b-b6cb-faf8fa8bb35a_del complete
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.472 187216 INFO nova.compute.manager [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Took 2.22 seconds to destroy the instance on the hypervisor.
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.473 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.474 187216 DEBUG nova.compute.manager [-] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.474 187216 DEBUG nova.network.neutron [-] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.474 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.579 187216 DEBUG nova.compute.manager [req-99420ed2-cb07-4c84-9ed6-4e86b6d239ba req-641b363b-36b1-4452-909c-2295c7c433b9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Received event network-vif-unplugged-1d533527-0957-4c0b-893c-65f597515760 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.580 187216 DEBUG oslo_concurrency.lockutils [req-99420ed2-cb07-4c84-9ed6-4e86b6d239ba req-641b363b-36b1-4452-909c-2295c7c433b9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.580 187216 DEBUG oslo_concurrency.lockutils [req-99420ed2-cb07-4c84-9ed6-4e86b6d239ba req-641b363b-36b1-4452-909c-2295c7c433b9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.580 187216 DEBUG oslo_concurrency.lockutils [req-99420ed2-cb07-4c84-9ed6-4e86b6d239ba req-641b363b-36b1-4452-909c-2295c7c433b9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.581 187216 DEBUG nova.compute.manager [req-99420ed2-cb07-4c84-9ed6-4e86b6d239ba req-641b363b-36b1-4452-909c-2295c7c433b9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] No waiting events found dispatching network-vif-unplugged-1d533527-0957-4c0b-893c-65f597515760 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.581 187216 DEBUG nova.compute.manager [req-99420ed2-cb07-4c84-9ed6-4e86b6d239ba req-641b363b-36b1-4452-909c-2295c7c433b9 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Received event network-vif-unplugged-1d533527-0957-4c0b-893c-65f597515760 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:50 compute-0 nova_compute[187212]: 2025-11-25 19:15:50.609 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:51 compute-0 nova_compute[187212]: 2025-11-25 19:15:51.939 187216 DEBUG nova.network.neutron [-] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:52 compute-0 nova_compute[187212]: 2025-11-25 19:15:52.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:52 compute-0 nova_compute[187212]: 2025-11-25 19:15:52.447 187216 INFO nova.compute.manager [-] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Took 1.97 seconds to deallocate network for instance.
Nov 25 19:15:52 compute-0 nova_compute[187212]: 2025-11-25 19:15:52.644 187216 DEBUG nova.compute.manager [req-baaca4e0-6255-473c-876a-091729599d37 req-d06f603b-9271-4ca5-a106-5bc01545a2bd 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a] Received event network-vif-deleted-1d533527-0957-4c0b-893c-65f597515760 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:52 compute-0 nova_compute[187212]: 2025-11-25 19:15:52.970 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:52 compute-0 nova_compute[187212]: 2025-11-25 19:15:52.971 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:53 compute-0 nova_compute[187212]: 2025-11-25 19:15:53.111 187216 DEBUG nova.compute.provider_tree [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:15:53 compute-0 nova_compute[187212]: 2025-11-25 19:15:53.619 187216 DEBUG nova.scheduler.client.report [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:15:54 compute-0 nova_compute[187212]: 2025-11-25 19:15:54.129 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:54 compute-0 nova_compute[187212]: 2025-11-25 19:15:54.172 187216 INFO nova.scheduler.client.report [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Deleted allocations for instance 26b4dae6-9c02-403b-b6cb-faf8fa8bb35a
Nov 25 19:15:54 compute-0 nova_compute[187212]: 2025-11-25 19:15:54.950 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:55 compute-0 nova_compute[187212]: 2025-11-25 19:15:55.211 187216 DEBUG oslo_concurrency.lockutils [None req-6fd4b5ec-594f-4ba0-bbe6-150820265901 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "26b4dae6-9c02-403b-b6cb-faf8fa8bb35a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.490s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:56 compute-0 podman[212464]: 2025-11-25 19:15:56.163707899 +0000 UTC m=+0.084307553 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.106 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.139 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.140 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.140 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.141 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.141 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.156 187216 INFO nova.compute.manager [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Terminating instance
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.675 187216 DEBUG nova.compute.manager [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:15:57 compute-0 kernel: tape568cb76-eb (unregistering): left promiscuous mode
Nov 25 19:15:57 compute-0 NetworkManager[55552]: <info>  [1764098157.7491] device (tape568cb76-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.760 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 ovn_controller[95465]: 2025-11-25T19:15:57Z|00088|binding|INFO|Releasing lport e568cb76-eb81-4449-aed6-d84ad4a0f086 from this chassis (sb_readonly=0)
Nov 25 19:15:57 compute-0 ovn_controller[95465]: 2025-11-25T19:15:57Z|00089|binding|INFO|Setting lport e568cb76-eb81-4449-aed6-d84ad4a0f086 down in Southbound
Nov 25 19:15:57 compute-0 ovn_controller[95465]: 2025-11-25T19:15:57Z|00090|binding|INFO|Removing iface tape568cb76-eb ovn-installed in OVS
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.766 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.773 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:ba:48 10.100.0.5'], port_security=['fa:16:3e:82:ba:48 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '909b423a-9e57-4bb8-b6b5-719b05724d71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=e568cb76-eb81-4449-aed6-d84ad4a0f086) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.775 104356 INFO neutron.agent.ovn.metadata.agent [-] Port e568cb76-eb81-4449-aed6-d84ad4a0f086 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.777 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.787 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.801 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[46fce403-c6ca-430a-ad8e-44c48079b8be]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 25 19:15:57 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 24.799s CPU time.
Nov 25 19:15:57 compute-0 systemd-machined[153494]: Machine qemu-2-instance-00000005 terminated.
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.850 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[e322794c-b9f4-4d3b-acec-970c9cd31ad9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.854 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[91aa8584-4aa3-4c87-beb4-7c467eb192c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.897 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5383ba-0528-4970-88fc-b437bf4e83fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.945 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.956 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a58711c2-6c3c-4dd7-9b3f-72f595807859]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22e324dc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:f9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 1792, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387728, 'reachable_time': 42313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212503, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.971 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9ed20c-955c-4d89-bfd4-bf7113b4580f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387743, 'tstamp': 387743}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212510, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22e324dc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387748, 'tstamp': 387748}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212510, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.972 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.977 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.977 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22e324dc-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.977 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.977 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22e324dc-30, col_values=(('external_ids', {'iface-id': 'a599677f-a9c8-4759-a6d8-6e08d6b4e0d1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.978 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:15:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:57.979 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[5c57ac6d-d003-460a-9abd-c437ccf02b31]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22e324dc-3f92-4b1c-b9f6-81cfabbc2783\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.989 187216 INFO nova.virt.libvirt.driver [-] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Instance destroyed successfully.
Nov 25 19:15:57 compute-0 nova_compute[187212]: 2025-11-25 19:15:57.989 187216 DEBUG nova.objects.instance [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'resources' on Instance uuid 909b423a-9e57-4bb8-b6b5-719b05724d71 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.338 187216 DEBUG nova.compute.manager [req-cf56bb7b-6ec4-437a-aefa-188d35868399 req-99376d29-1724-4217-98bd-a2c45ff60cef 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-unplugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.339 187216 DEBUG oslo_concurrency.lockutils [req-cf56bb7b-6ec4-437a-aefa-188d35868399 req-99376d29-1724-4217-98bd-a2c45ff60cef 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.339 187216 DEBUG oslo_concurrency.lockutils [req-cf56bb7b-6ec4-437a-aefa-188d35868399 req-99376d29-1724-4217-98bd-a2c45ff60cef 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.339 187216 DEBUG oslo_concurrency.lockutils [req-cf56bb7b-6ec4-437a-aefa-188d35868399 req-99376d29-1724-4217-98bd-a2c45ff60cef 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.340 187216 DEBUG nova.compute.manager [req-cf56bb7b-6ec4-437a-aefa-188d35868399 req-99376d29-1724-4217-98bd-a2c45ff60cef 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] No waiting events found dispatching network-vif-unplugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.340 187216 DEBUG nova.compute.manager [req-cf56bb7b-6ec4-437a-aefa-188d35868399 req-99376d29-1724-4217-98bd-a2c45ff60cef 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-unplugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.496 187216 DEBUG nova.virt.libvirt.vif [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:11:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-827340657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-827340657',id=5,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:12:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-zf07q39k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:12:11Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=909b423a-9e57-4bb8-b6b5-719b05724d71,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.497 187216 DEBUG nova.network.os_vif_util [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "address": "fa:16:3e:82:ba:48", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape568cb76-eb", "ovs_interfaceid": "e568cb76-eb81-4449-aed6-d84ad4a0f086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.498 187216 DEBUG nova.network.os_vif_util [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.499 187216 DEBUG os_vif [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.501 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.501 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape568cb76-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.503 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.506 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.506 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.507 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e7a644f5-e5d1-4aaf-9fc5-c769895e04a8) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.508 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.509 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.511 187216 INFO os_vif [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:ba:48,bridge_name='br-int',has_traffic_filtering=True,id=e568cb76-eb81-4449-aed6-d84ad4a0f086,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape568cb76-eb')
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.512 187216 INFO nova.virt.libvirt.driver [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Deleting instance files /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71_del
Nov 25 19:15:58 compute-0 nova_compute[187212]: 2025-11-25 19:15:58.513 187216 INFO nova.virt.libvirt.driver [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Deletion of /var/lib/nova/instances/909b423a-9e57-4bb8-b6b5-719b05724d71_del complete
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.027 187216 INFO nova.compute.manager [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Took 1.35 seconds to destroy the instance on the hypervisor.
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.028 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.029 187216 DEBUG nova.compute.manager [-] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.029 187216 DEBUG nova.network.neutron [-] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.029 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.231 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:15:59 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:59.465 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:15:59 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:15:59.466 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.467 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.567 187216 DEBUG nova.compute.manager [req-208fef5b-8d88-4a9c-87c1-2254869cf7a4 req-e547da6f-5175-470e-85cb-82641a30362a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-deleted-e568cb76-eb81-4449-aed6-d84ad4a0f086 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.568 187216 INFO nova.compute.manager [req-208fef5b-8d88-4a9c-87c1-2254869cf7a4 req-e547da6f-5175-470e-85cb-82641a30362a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Neutron deleted interface e568cb76-eb81-4449-aed6-d84ad4a0f086; detaching it from the instance and deleting it from the info cache
Nov 25 19:15:59 compute-0 nova_compute[187212]: 2025-11-25 19:15:59.569 187216 DEBUG nova.network.neutron [req-208fef5b-8d88-4a9c-87c1-2254869cf7a4 req-e547da6f-5175-470e-85cb-82641a30362a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:15:59 compute-0 podman[197585]: time="2025-11-25T19:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:15:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:15:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3079 "" "Go-http-client/1.1"
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.008 187216 DEBUG nova.network.neutron [-] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.079 187216 DEBUG nova.compute.manager [req-208fef5b-8d88-4a9c-87c1-2254869cf7a4 req-e547da6f-5175-470e-85cb-82641a30362a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Detach interface failed, port_id=e568cb76-eb81-4449-aed6-d84ad4a0f086, reason: Instance 909b423a-9e57-4bb8-b6b5-719b05724d71 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.415 187216 DEBUG nova.compute.manager [req-0ccced6b-0bfb-41f6-8979-56fa37cd32f2 req-25f01d79-d57d-438b-aa86-6d176ba88966 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-unplugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.415 187216 DEBUG oslo_concurrency.lockutils [req-0ccced6b-0bfb-41f6-8979-56fa37cd32f2 req-25f01d79-d57d-438b-aa86-6d176ba88966 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.416 187216 DEBUG oslo_concurrency.lockutils [req-0ccced6b-0bfb-41f6-8979-56fa37cd32f2 req-25f01d79-d57d-438b-aa86-6d176ba88966 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.416 187216 DEBUG oslo_concurrency.lockutils [req-0ccced6b-0bfb-41f6-8979-56fa37cd32f2 req-25f01d79-d57d-438b-aa86-6d176ba88966 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.416 187216 DEBUG nova.compute.manager [req-0ccced6b-0bfb-41f6-8979-56fa37cd32f2 req-25f01d79-d57d-438b-aa86-6d176ba88966 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] No waiting events found dispatching network-vif-unplugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.417 187216 DEBUG nova.compute.manager [req-0ccced6b-0bfb-41f6-8979-56fa37cd32f2 req-25f01d79-d57d-438b-aa86-6d176ba88966 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Received event network-vif-unplugged-e568cb76-eb81-4449-aed6-d84ad4a0f086 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:16:00 compute-0 nova_compute[187212]: 2025-11-25 19:16:00.515 187216 INFO nova.compute.manager [-] [instance: 909b423a-9e57-4bb8-b6b5-719b05724d71] Took 1.49 seconds to deallocate network for instance.
Nov 25 19:16:01 compute-0 nova_compute[187212]: 2025-11-25 19:16:01.048 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:01 compute-0 nova_compute[187212]: 2025-11-25 19:16:01.049 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:01 compute-0 nova_compute[187212]: 2025-11-25 19:16:01.130 187216 DEBUG nova.compute.provider_tree [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:16:01 compute-0 openstack_network_exporter[199731]: ERROR   19:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:16:01 compute-0 openstack_network_exporter[199731]: ERROR   19:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:16:01 compute-0 openstack_network_exporter[199731]: ERROR   19:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:16:01 compute-0 openstack_network_exporter[199731]: ERROR   19:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:16:01 compute-0 openstack_network_exporter[199731]: ERROR   19:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:16:01 compute-0 nova_compute[187212]: 2025-11-25 19:16:01.672 187216 DEBUG nova.scheduler.client.report [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:16:02 compute-0 nova_compute[187212]: 2025-11-25 19:16:02.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:02 compute-0 nova_compute[187212]: 2025-11-25 19:16:02.188 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.139s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:02 compute-0 podman[212521]: 2025-11-25 19:16:02.196552905 +0000 UTC m=+0.113188143 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 19:16:02 compute-0 nova_compute[187212]: 2025-11-25 19:16:02.222 187216 INFO nova.scheduler.client.report [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Deleted allocations for instance 909b423a-9e57-4bb8-b6b5-719b05724d71
Nov 25 19:16:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:02.470 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:16:03 compute-0 nova_compute[187212]: 2025-11-25 19:16:03.256 187216 DEBUG oslo_concurrency.lockutils [None req-61eda9eb-68ef-408a-94dc-a5b5c5e60ef8 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "909b423a-9e57-4bb8-b6b5-719b05724d71" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:03 compute-0 nova_compute[187212]: 2025-11-25 19:16:03.509 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.104 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.105 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.106 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.106 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.107 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.127 187216 INFO nova.compute.manager [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Terminating instance
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.646 187216 DEBUG nova.compute.manager [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:16:05 compute-0 kernel: tapb74c368f-ba (unregistering): left promiscuous mode
Nov 25 19:16:05 compute-0 NetworkManager[55552]: <info>  [1764098165.6746] device (tapb74c368f-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.682 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:05 compute-0 ovn_controller[95465]: 2025-11-25T19:16:05Z|00091|binding|INFO|Releasing lport b74c368f-baf3-47d1-9cfb-df249446cbb3 from this chassis (sb_readonly=0)
Nov 25 19:16:05 compute-0 ovn_controller[95465]: 2025-11-25T19:16:05Z|00092|binding|INFO|Setting lport b74c368f-baf3-47d1-9cfb-df249446cbb3 down in Southbound
Nov 25 19:16:05 compute-0 ovn_controller[95465]: 2025-11-25T19:16:05Z|00093|binding|INFO|Removing iface tapb74c368f-ba ovn-installed in OVS
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.685 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.695 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:ad:b9 10.100.0.6'], port_security=['fa:16:3e:d5:ad:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'dd2a5303-3518-4f79-aa7b-45fc96059d01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '780511b4bf4d49299cc4d9b324261841', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a4e776e6-0bf6-4a60-969e-a83df4aa40b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b278f2-fcb2-49be-ac5b-e0083010c7b4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=b74c368f-baf3-47d1-9cfb-df249446cbb3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.696 104356 INFO neutron.agent.ovn.metadata.agent [-] Port b74c368f-baf3-47d1-9cfb-df249446cbb3 in datapath 22e324dc-3f92-4b1c-b9f6-81cfabbc2783 unbound from our chassis
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.698 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22e324dc-3f92-4b1c-b9f6-81cfabbc2783, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.699 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0022f202-5742-4fa9-8953-9cd5ce1eabce]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.700 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783 namespace which is not needed anymore
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.711 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:05 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 25 19:16:05 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 23.928s CPU time.
Nov 25 19:16:05 compute-0 systemd-machined[153494]: Machine qemu-3-instance-00000004 terminated.
Nov 25 19:16:05 compute-0 podman[212551]: 2025-11-25 19:16:05.769714467 +0000 UTC m=+0.066031755 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:16:05 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [NOTICE]   (210973) : haproxy version is 3.0.5-8e879a5
Nov 25 19:16:05 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [NOTICE]   (210973) : path to executable is /usr/sbin/haproxy
Nov 25 19:16:05 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [WARNING]  (210973) : Exiting Master process...
Nov 25 19:16:05 compute-0 podman[212591]: 2025-11-25 19:16:05.861677528 +0000 UTC m=+0.043135359 container kill 30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 19:16:05 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [ALERT]    (210973) : Current worker (210975) exited with code 143 (Terminated)
Nov 25 19:16:05 compute-0 neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783[210969]: [WARNING]  (210973) : All workers exited. Exiting... (0)
Nov 25 19:16:05 compute-0 systemd[1]: libpod-30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd.scope: Deactivated successfully.
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.913 187216 DEBUG nova.compute.manager [req-0520f076-d739-4391-b337-2d6f1311a31f req-b8227fd4-ff98-4fc1-b913-d26287f49297 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.914 187216 DEBUG oslo_concurrency.lockutils [req-0520f076-d739-4391-b337-2d6f1311a31f req-b8227fd4-ff98-4fc1-b913-d26287f49297 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.915 187216 DEBUG oslo_concurrency.lockutils [req-0520f076-d739-4391-b337-2d6f1311a31f req-b8227fd4-ff98-4fc1-b913-d26287f49297 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.915 187216 DEBUG oslo_concurrency.lockutils [req-0520f076-d739-4391-b337-2d6f1311a31f req-b8227fd4-ff98-4fc1-b913-d26287f49297 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.916 187216 DEBUG nova.compute.manager [req-0520f076-d739-4391-b337-2d6f1311a31f req-b8227fd4-ff98-4fc1-b913-d26287f49297 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] No waiting events found dispatching network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.916 187216 DEBUG nova.compute.manager [req-0520f076-d739-4391-b337-2d6f1311a31f req-b8227fd4-ff98-4fc1-b913-d26287f49297 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.925 187216 INFO nova.virt.libvirt.driver [-] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Instance destroyed successfully.
Nov 25 19:16:05 compute-0 podman[212616]: 2025-11-25 19:16:05.925277745 +0000 UTC m=+0.029404804 container died 30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:16:05 compute-0 nova_compute[187212]: 2025-11-25 19:16:05.926 187216 DEBUG nova.objects.instance [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lazy-loading 'resources' on Instance uuid dd2a5303-3518-4f79-aa7b-45fc96059d01 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:16:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd-userdata-shm.mount: Deactivated successfully.
Nov 25 19:16:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-95d29816be6458cdb1ec30a1e513ae35fdeb26125479a4fc22754edf1e40a904-merged.mount: Deactivated successfully.
Nov 25 19:16:05 compute-0 podman[212616]: 2025-11-25 19:16:05.981033168 +0000 UTC m=+0.085160207 container remove 30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.988 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[17e4d3fa-44d8-4298-891f-32af20891cf5]: (4, ("Tue Nov 25 07:16:05 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783 (30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd)\n30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd\nTue Nov 25 07:16:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783 (30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd)\n30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.989 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[17dbc501-2849-46be-abed-82a2d44ba36b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.990 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22e324dc-3f92-4b1c-b9f6-81cfabbc2783.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.990 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8141b64f-092d-49d8-9a1b-2a8fb7caabeb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:05.992 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22e324dc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:16:05 compute-0 systemd[1]: libpod-conmon-30e03b69ea72792bbd6da516484326c566f4600d9861a299131c3d575810abfd.scope: Deactivated successfully.
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.041 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 kernel: tap22e324dc-30: left promiscuous mode
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.056 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.061 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:06.063 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c3163b4c-21f2-4148-9b4d-4621673475d9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:06.079 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[70b344dc-3191-41fb-8b4d-8083f5a6c8a1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:06.080 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[50a676fe-161e-46cc-8756-8d7ff2a22cd1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:06.101 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[aace91ee-02ae-45de-9fbd-8f9bf52d04c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387718, 'reachable_time': 39214, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212652, 'error': None, 'target': 'ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:06.105 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-22e324dc-3f92-4b1c-b9f6-81cfabbc2783 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:16:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:06.105 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[926f97f2-2be8-4379-bfa5-bf15a31f6fd4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d22e324dc\x2d3f92\x2d4b1c\x2db9f6\x2d81cfabbc2783.mount: Deactivated successfully.
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.434 187216 DEBUG nova.virt.libvirt.vif [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:11:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-122760017',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-122760017',id=4,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:12:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='780511b4bf4d49299cc4d9b324261841',ramdisk_id='',reservation_id='r-6sh6lchw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1103022868',owner_user_name='tempest-TestExecuteActionsViaActuator-1103022868-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:13:00Z,user_data=None,user_id='7c561073d7c34a029574a6e2fb952944',uuid=dd2a5303-3518-4f79-aa7b-45fc96059d01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.435 187216 DEBUG nova.network.os_vif_util [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converting VIF {"id": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "address": "fa:16:3e:d5:ad:b9", "network": {"id": "22e324dc-3f92-4b1c-b9f6-81cfabbc2783", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1292145433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b7f575a862343fbb3396239106e3968", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb74c368f-ba", "ovs_interfaceid": "b74c368f-baf3-47d1-9cfb-df249446cbb3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.436 187216 DEBUG nova.network.os_vif_util [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.437 187216 DEBUG os_vif [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.440 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb74c368f-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.442 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.443 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.445 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.445 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3f95acd4-5d09-461c-8dc8-a16d8cecacd0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.446 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.448 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.450 187216 INFO os_vif [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:ad:b9,bridge_name='br-int',has_traffic_filtering=True,id=b74c368f-baf3-47d1-9cfb-df249446cbb3,network=Network(22e324dc-3f92-4b1c-b9f6-81cfabbc2783),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb74c368f-ba')
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.451 187216 INFO nova.virt.libvirt.driver [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Deleting instance files /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01_del
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.456 187216 INFO nova.virt.libvirt.driver [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Deletion of /var/lib/nova/instances/dd2a5303-3518-4f79-aa7b-45fc96059d01_del complete
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.972 187216 INFO nova.compute.manager [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Took 1.32 seconds to destroy the instance on the hypervisor.
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.972 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.973 187216 DEBUG nova.compute.manager [-] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.973 187216 DEBUG nova.network.neutron [-] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:16:06 compute-0 nova_compute[187212]: 2025-11-25 19:16:06.974 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:16:07 compute-0 nova_compute[187212]: 2025-11-25 19:16:07.142 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:07 compute-0 nova_compute[187212]: 2025-11-25 19:16:07.261 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.093 187216 DEBUG nova.network.neutron [-] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.206 187216 DEBUG nova.compute.manager [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.206 187216 DEBUG oslo_concurrency.lockutils [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.206 187216 DEBUG oslo_concurrency.lockutils [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.206 187216 DEBUG oslo_concurrency.lockutils [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.206 187216 DEBUG nova.compute.manager [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] No waiting events found dispatching network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.207 187216 DEBUG nova.compute.manager [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-unplugged-b74c368f-baf3-47d1-9cfb-df249446cbb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.207 187216 DEBUG nova.compute.manager [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Received event network-vif-deleted-b74c368f-baf3-47d1-9cfb-df249446cbb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.207 187216 INFO nova.compute.manager [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Neutron deleted interface b74c368f-baf3-47d1-9cfb-df249446cbb3; detaching it from the instance and deleting it from the info cache
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.207 187216 DEBUG nova.network.neutron [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.600 187216 INFO nova.compute.manager [-] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Took 1.63 seconds to deallocate network for instance.
Nov 25 19:16:08 compute-0 nova_compute[187212]: 2025-11-25 19:16:08.715 187216 DEBUG nova.compute.manager [req-67544037-ec4d-4af4-84b8-3fb433affe55 req-095557b5-b234-4849-bdb5-c592475efdf8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: dd2a5303-3518-4f79-aa7b-45fc96059d01] Detach interface failed, port_id=b74c368f-baf3-47d1-9cfb-df249446cbb3, reason: Instance dd2a5303-3518-4f79-aa7b-45fc96059d01 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:16:09 compute-0 nova_compute[187212]: 2025-11-25 19:16:09.118 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:09 compute-0 nova_compute[187212]: 2025-11-25 19:16:09.119 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:09 compute-0 nova_compute[187212]: 2025-11-25 19:16:09.190 187216 DEBUG nova.compute.provider_tree [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:16:09 compute-0 nova_compute[187212]: 2025-11-25 19:16:09.697 187216 DEBUG nova.scheduler.client.report [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:16:10 compute-0 nova_compute[187212]: 2025-11-25 19:16:10.206 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:10 compute-0 nova_compute[187212]: 2025-11-25 19:16:10.244 187216 INFO nova.scheduler.client.report [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Deleted allocations for instance dd2a5303-3518-4f79-aa7b-45fc96059d01
Nov 25 19:16:11 compute-0 podman[212653]: 2025-11-25 19:16:11.201656221 +0000 UTC m=+0.119418022 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 19:16:11 compute-0 nova_compute[187212]: 2025-11-25 19:16:11.333 187216 DEBUG oslo_concurrency.lockutils [None req-606c56dc-bebf-4b6e-baa4-24df113871f2 7c561073d7c34a029574a6e2fb952944 780511b4bf4d49299cc4d9b324261841 - - default default] Lock "dd2a5303-3518-4f79-aa7b-45fc96059d01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.228s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:11 compute-0 nova_compute[187212]: 2025-11-25 19:16:11.447 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:12 compute-0 nova_compute[187212]: 2025-11-25 19:16:12.144 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:15 compute-0 podman[212675]: 2025-11-25 19:16:15.164791441 +0000 UTC m=+0.081391363 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:16:15 compute-0 nova_compute[187212]: 2025-11-25 19:16:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:15 compute-0 nova_compute[187212]: 2025-11-25 19:16:15.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:16:16 compute-0 nova_compute[187212]: 2025-11-25 19:16:16.482 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:17 compute-0 nova_compute[187212]: 2025-11-25 19:16:17.147 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:17 compute-0 nova_compute[187212]: 2025-11-25 19:16:17.460 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:19 compute-0 nova_compute[187212]: 2025-11-25 19:16:19.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.485 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.696 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.958 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.959 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.992 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.993 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5831MB free_disk=72.99675369262695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.994 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:21 compute-0 nova_compute[187212]: 2025-11-25 19:16:21.994 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:22 compute-0 nova_compute[187212]: 2025-11-25 19:16:22.189 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:23 compute-0 nova_compute[187212]: 2025-11-25 19:16:23.055 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:16:23 compute-0 nova_compute[187212]: 2025-11-25 19:16:23.056 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:16:21 up  1:08,  0 user,  load average: 0.49, 0.57, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:16:23 compute-0 nova_compute[187212]: 2025-11-25 19:16:23.089 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:16:23 compute-0 nova_compute[187212]: 2025-11-25 19:16:23.601 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:16:24 compute-0 nova_compute[187212]: 2025-11-25 19:16:24.119 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:16:24 compute-0 nova_compute[187212]: 2025-11-25 19:16:24.120 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:25 compute-0 nova_compute[187212]: 2025-11-25 19:16:25.115 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:25 compute-0 nova_compute[187212]: 2025-11-25 19:16:25.116 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:26 compute-0 nova_compute[187212]: 2025-11-25 19:16:26.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:26 compute-0 nova_compute[187212]: 2025-11-25 19:16:26.488 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:27 compute-0 podman[212697]: 2025-11-25 19:16:27.164701915 +0000 UTC m=+0.078827934 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:16:27 compute-0 nova_compute[187212]: 2025-11-25 19:16:27.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:16:27 compute-0 nova_compute[187212]: 2025-11-25 19:16:27.192 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:29 compute-0 podman[197585]: time="2025-11-25T19:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:16:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:16:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2610 "" "Go-http-client/1.1"
Nov 25 19:16:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:31.087 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:16:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:31.088 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:16:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:31.088 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: ERROR   19:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: ERROR   19:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: ERROR   19:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: ERROR   19:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: ERROR   19:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:16:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:16:31 compute-0 nova_compute[187212]: 2025-11-25 19:16:31.489 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:32 compute-0 nova_compute[187212]: 2025-11-25 19:16:32.193 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:32.711 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:69:db 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8adb6455c6724d8fa5a19dd6d4d677ea', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=874f2765-a6cf-42e0-8c42-359508942dc4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0c24d2e0-449b-4c43-8064-1c741005deb8) old=Port_Binding(mac=['fa:16:3e:3c:69:db'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8adb6455c6724d8fa5a19dd6d4d677ea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:16:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:32.713 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0c24d2e0-449b-4c43-8064-1c741005deb8 in datapath d59a13e0-0444-4801-a38d-c9b9b692bc71 updated
Nov 25 19:16:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:32.714 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d59a13e0-0444-4801-a38d-c9b9b692bc71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:16:32 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:32.715 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[fde4c26f-78ab-45f5-9185-70878a71dac6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:33 compute-0 podman[212723]: 2025-11-25 19:16:33.239831853 +0000 UTC m=+0.162769366 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4)
Nov 25 19:16:36 compute-0 podman[212749]: 2025-11-25 19:16:36.180817121 +0000 UTC m=+0.102494230 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:16:36 compute-0 nova_compute[187212]: 2025-11-25 19:16:36.491 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:37 compute-0 nova_compute[187212]: 2025-11-25 19:16:37.194 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:39.546 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:01:70 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e5dba86e-b13b-43f6-b255-e4630fdf5be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5dba86e-b13b-43f6-b255-e4630fdf5be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '492e2572d9d24bf29c79f2b1c9dab462', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee976277-807b-4574-b00c-18c492f2efb9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0d87f8ac-dc77-49f9-be2a-3476a842d659) old=Port_Binding(mac=['fa:16:3e:94:01:70'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e5dba86e-b13b-43f6-b255-e4630fdf5be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5dba86e-b13b-43f6-b255-e4630fdf5be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '492e2572d9d24bf29c79f2b1c9dab462', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:16:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:39.547 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0d87f8ac-dc77-49f9-be2a-3476a842d659 in datapath e5dba86e-b13b-43f6-b255-e4630fdf5be3 updated
Nov 25 19:16:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:39.549 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5dba86e-b13b-43f6-b255-e4630fdf5be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:16:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:16:39.550 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[decd04cf-517c-49a5-b8ee-d2cdfdc829a9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:16:41 compute-0 nova_compute[187212]: 2025-11-25 19:16:41.528 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:42 compute-0 podman[212770]: 2025-11-25 19:16:42.167711319 +0000 UTC m=+0.088928310 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:16:42 compute-0 nova_compute[187212]: 2025-11-25 19:16:42.196 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:46 compute-0 podman[212792]: 2025-11-25 19:16:46.173814784 +0000 UTC m=+0.090556184 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Nov 25 19:16:46 compute-0 nova_compute[187212]: 2025-11-25 19:16:46.529 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:47 compute-0 nova_compute[187212]: 2025-11-25 19:16:47.198 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:51 compute-0 nova_compute[187212]: 2025-11-25 19:16:51.531 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:52 compute-0 nova_compute[187212]: 2025-11-25 19:16:52.202 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:54 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 19:16:54 compute-0 ovn_controller[95465]: 2025-11-25T19:16:54Z|00094|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 25 19:16:56 compute-0 nova_compute[187212]: 2025-11-25 19:16:56.569 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:57 compute-0 nova_compute[187212]: 2025-11-25 19:16:57.203 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:16:58 compute-0 podman[212813]: 2025-11-25 19:16:58.120980954 +0000 UTC m=+0.054070208 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:16:59 compute-0 podman[197585]: time="2025-11-25T19:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:16:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:16:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2622 "" "Go-http-client/1.1"
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: ERROR   19:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: ERROR   19:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: ERROR   19:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: ERROR   19:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: ERROR   19:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:17:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:17:01 compute-0 nova_compute[187212]: 2025-11-25 19:17:01.571 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:02 compute-0 nova_compute[187212]: 2025-11-25 19:17:02.252 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:04 compute-0 podman[212838]: 2025-11-25 19:17:04.211903015 +0000 UTC m=+0.143635124 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:17:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:05.727 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:17:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:05.728 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:17:05 compute-0 nova_compute[187212]: 2025-11-25 19:17:05.731 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:06 compute-0 nova_compute[187212]: 2025-11-25 19:17:06.573 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:07 compute-0 podman[212865]: 2025-11-25 19:17:07.147586567 +0000 UTC m=+0.071506374 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Nov 25 19:17:07 compute-0 nova_compute[187212]: 2025-11-25 19:17:07.254 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:11 compute-0 nova_compute[187212]: 2025-11-25 19:17:11.576 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:11 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:11.730 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:12 compute-0 nova_compute[187212]: 2025-11-25 19:17:12.256 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:13 compute-0 podman[212884]: 2025-11-25 19:17:13.160914417 +0000 UTC m=+0.085885507 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, config_id=edpm)
Nov 25 19:17:13 compute-0 nova_compute[187212]: 2025-11-25 19:17:13.909 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:13 compute-0 nova_compute[187212]: 2025-11-25 19:17:13.910 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:14 compute-0 nova_compute[187212]: 2025-11-25 19:17:14.415 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:17:14 compute-0 nova_compute[187212]: 2025-11-25 19:17:14.975 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:14 compute-0 nova_compute[187212]: 2025-11-25 19:17:14.976 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:14 compute-0 nova_compute[187212]: 2025-11-25 19:17:14.988 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:17:14 compute-0 nova_compute[187212]: 2025-11-25 19:17:14.989 187216 INFO nova.compute.claims [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:17:16 compute-0 nova_compute[187212]: 2025-11-25 19:17:16.055 187216 DEBUG nova.compute.provider_tree [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:17:16 compute-0 nova_compute[187212]: 2025-11-25 19:17:16.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:16 compute-0 nova_compute[187212]: 2025-11-25 19:17:16.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:17:16 compute-0 nova_compute[187212]: 2025-11-25 19:17:16.563 187216 DEBUG nova.scheduler.client.report [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:17:16 compute-0 nova_compute[187212]: 2025-11-25 19:17:16.619 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.078 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.080 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:17:17 compute-0 podman[212905]: 2025-11-25 19:17:17.17794168 +0000 UTC m=+0.096543858 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.256 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.595 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.596 187216 DEBUG nova.network.neutron [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.596 187216 WARNING neutronclient.v2_0.client [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:17:17 compute-0 nova_compute[187212]: 2025-11-25 19:17:17.597 187216 WARNING neutronclient.v2_0.client [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:17:18 compute-0 nova_compute[187212]: 2025-11-25 19:17:18.105 187216 INFO nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:17:18 compute-0 nova_compute[187212]: 2025-11-25 19:17:18.357 187216 DEBUG nova.network.neutron [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Successfully created port: 590629bf-d033-4f3a-a3d8-5eb92ceecde0 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:17:18 compute-0 nova_compute[187212]: 2025-11-25 19:17:18.614 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.074 187216 DEBUG nova.network.neutron [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Successfully updated port: 590629bf-d033-4f3a-a3d8-5eb92ceecde0 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.157 187216 DEBUG nova.compute.manager [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-changed-590629bf-d033-4f3a-a3d8-5eb92ceecde0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.157 187216 DEBUG nova.compute.manager [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Refreshing instance network info cache due to event network-changed-590629bf-d033-4f3a-a3d8-5eb92ceecde0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.158 187216 DEBUG oslo_concurrency.lockutils [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-868f271d-a554-4ac7-8fc5-de42da59e9f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.158 187216 DEBUG oslo_concurrency.lockutils [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-868f271d-a554-4ac7-8fc5-de42da59e9f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.158 187216 DEBUG nova.network.neutron [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Refreshing network info cache for port 590629bf-d033-4f3a-a3d8-5eb92ceecde0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.587 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "refresh_cache-868f271d-a554-4ac7-8fc5-de42da59e9f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.633 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.635 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.636 187216 INFO nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Creating image(s)
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.636 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "/var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.637 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "/var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.638 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "/var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.639 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.645 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.647 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.668 187216 WARNING neutronclient.v2_0.client [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.738 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.739 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.740 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.741 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.748 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.749 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.807 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.808 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.853 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.855 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.856 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.914 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.915 187216 DEBUG nova.virt.disk.api [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Checking if we can resize image /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.916 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.997 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:19 compute-0 nova_compute[187212]: 2025-11-25 19:17:19.999 187216 DEBUG nova.virt.disk.api [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Cannot resize image /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.000 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.000 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Ensure instance console log exists: /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.001 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.002 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.002 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.274 187216 DEBUG nova.network.neutron [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.443 187216 DEBUG nova.network.neutron [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.953 187216 DEBUG oslo_concurrency.lockutils [req-5eac9779-6bd8-448f-8c18-84ef210f334c req-7378dfbd-d8c6-468a-a073-8ccaba993496 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-868f271d-a554-4ac7-8fc5-de42da59e9f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.955 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquired lock "refresh_cache-868f271d-a554-4ac7-8fc5-de42da59e9f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:17:20 compute-0 nova_compute[187212]: 2025-11-25 19:17:20.955 187216 DEBUG nova.network.neutron [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.622 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.687 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.904 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.905 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.944 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.944 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5825MB free_disk=72.99654388427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.945 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:21 compute-0 nova_compute[187212]: 2025-11-25 19:17:21.945 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:22 compute-0 nova_compute[187212]: 2025-11-25 19:17:22.291 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:22 compute-0 nova_compute[187212]: 2025-11-25 19:17:22.298 187216 DEBUG nova.network.neutron [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.004 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 868f271d-a554-4ac7-8fc5-de42da59e9f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.004 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.005 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:17:21 up  1:09,  0 user,  load average: 0.44, 0.53, 0.51\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_492e2572d9d24bf29c79f2b1c9dab462': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.065 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.250 187216 WARNING neutronclient.v2_0.client [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.414 187216 DEBUG nova.network.neutron [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Updating instance_info_cache with network_info: [{"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.573 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.921 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Releasing lock "refresh_cache-868f271d-a554-4ac7-8fc5-de42da59e9f4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.922 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Instance network_info: |[{"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.926 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Start _get_guest_xml network_info=[{"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.931 187216 WARNING nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.932 187216 DEBUG nova.virt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1434666508', uuid='868f271d-a554-4ac7-8fc5-de42da59e9f4'), owner=OwnerMeta(userid='81221121fff040f195867ab25b59f26e', username='tempest-TestExecuteBasicStrategy-625591926-project-admin', projectid='492e2572d9d24bf29c79f2b1c9dab462', projectname='tempest-TestExecuteBasicStrategy-625591926'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098243.932712) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.938 187216 DEBUG nova.virt.libvirt.host [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.939 187216 DEBUG nova.virt.libvirt.host [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.943 187216 DEBUG nova.virt.libvirt.host [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.943 187216 DEBUG nova.virt.libvirt.host [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.945 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.946 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.946 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.947 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.947 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.948 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.948 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.949 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.949 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.950 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.950 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.951 187216 DEBUG nova.virt.hardware [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.957 187216 DEBUG nova.virt.libvirt.vif [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1434666508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1434666508',id=11,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='492e2572d9d24bf29c79f2b1c9dab462',ramdisk_id='',reservation_id='r-06y2bo17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-625591926',owner_user_name='tempest-TestExecuteBasicStrategy-625591926-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:17:18Z,user_data=None,user_id='81221121fff040f195867ab25b59f26e',uuid=868f271d-a554-4ac7-8fc5-de42da59e9f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.957 187216 DEBUG nova.network.os_vif_util [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converting VIF {"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.959 187216 DEBUG nova.network.os_vif_util [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:17:23 compute-0 nova_compute[187212]: 2025-11-25 19:17:23.960 187216 DEBUG nova.objects.instance [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lazy-loading 'pci_devices' on Instance uuid 868f271d-a554-4ac7-8fc5-de42da59e9f4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.084 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.085 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.469 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <uuid>868f271d-a554-4ac7-8fc5-de42da59e9f4</uuid>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <name>instance-0000000b</name>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1434666508</nova:name>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:17:23</nova:creationTime>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:17:24 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:17:24 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:user uuid="81221121fff040f195867ab25b59f26e">tempest-TestExecuteBasicStrategy-625591926-project-admin</nova:user>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:project uuid="492e2572d9d24bf29c79f2b1c9dab462">tempest-TestExecuteBasicStrategy-625591926</nova:project>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         <nova:port uuid="590629bf-d033-4f3a-a3d8-5eb92ceecde0">
Nov 25 19:17:24 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <system>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <entry name="serial">868f271d-a554-4ac7-8fc5-de42da59e9f4</entry>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <entry name="uuid">868f271d-a554-4ac7-8fc5-de42da59e9f4</entry>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </system>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <os>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </os>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <features>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </features>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.config"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:f2:a6:bc"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <target dev="tap590629bf-d0"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/console.log" append="off"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <video>
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </video>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:17:24 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:17:24 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:17:24 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:17:24 compute-0 nova_compute[187212]: </domain>
Nov 25 19:17:24 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.471 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Preparing to wait for external event network-vif-plugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.471 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.471 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.472 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.472 187216 DEBUG nova.virt.libvirt.vif [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1434666508',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1434666508',id=11,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='492e2572d9d24bf29c79f2b1c9dab462',ramdisk_id='',reservation_id='r-06y2bo17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-625591926',owner_user_name='tempest-TestExecuteBasicStrategy-625591926-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:17:18Z,user_data=None,user_id='81221121fff040f195867ab25b59f26e',uuid=868f271d-a554-4ac7-8fc5-de42da59e9f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.473 187216 DEBUG nova.network.os_vif_util [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converting VIF {"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.473 187216 DEBUG nova.network.os_vif_util [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.474 187216 DEBUG os_vif [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.474 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.475 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.475 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.476 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.476 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4ab2bee1-a4f4-5799-994b-019df7797687', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.503 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.504 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.507 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.508 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap590629bf-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.508 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap590629bf-d0, col_values=(('qos', UUID('91be79bb-02fa-4afb-9d91-eb40c8281495')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.509 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap590629bf-d0, col_values=(('external_ids', {'iface-id': '590629bf-d033-4f3a-a3d8-5eb92ceecde0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:a6:bc', 'vm-uuid': '868f271d-a554-4ac7-8fc5-de42da59e9f4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.510 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:24 compute-0 NetworkManager[55552]: <info>  [1764098244.5120] manager: (tap590629bf-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.513 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.516 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:24 compute-0 nova_compute[187212]: 2025-11-25 19:17:24.517 187216 INFO os_vif [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0')
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.085 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.086 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.087 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.143 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.143 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.144 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] No VIF found with MAC fa:16:3e:f2:a6:bc, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.145 187216 INFO nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Using config drive
Nov 25 19:17:26 compute-0 nova_compute[187212]: 2025-11-25 19:17:26.736 187216 WARNING neutronclient.v2_0.client [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.292 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.387 187216 INFO nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Creating config drive at /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.config
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.393 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp4paryajs execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.535 187216 DEBUG oslo_concurrency.processutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp4paryajs" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:17:27 compute-0 NetworkManager[55552]: <info>  [1764098247.6303] manager: (tap590629bf-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Nov 25 19:17:27 compute-0 kernel: tap590629bf-d0: entered promiscuous mode
Nov 25 19:17:27 compute-0 ovn_controller[95465]: 2025-11-25T19:17:27Z|00095|binding|INFO|Claiming lport 590629bf-d033-4f3a-a3d8-5eb92ceecde0 for this chassis.
Nov 25 19:17:27 compute-0 ovn_controller[95465]: 2025-11-25T19:17:27Z|00096|binding|INFO|590629bf-d033-4f3a-a3d8-5eb92ceecde0: Claiming fa:16:3e:f2:a6:bc 10.100.0.4
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.634 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.639 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.647 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.650 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.658 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:a6:bc 10.100.0.4'], port_security=['fa:16:3e:f2:a6:bc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '868f271d-a554-4ac7-8fc5-de42da59e9f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '492e2572d9d24bf29c79f2b1c9dab462', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e9615ad8-5145-4bd7-be4d-76e46400d6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=874f2765-a6cf-42e0-8c42-359508942dc4, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=590629bf-d033-4f3a-a3d8-5eb92ceecde0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.659 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 590629bf-d033-4f3a-a3d8-5eb92ceecde0 in datapath d59a13e0-0444-4801-a38d-c9b9b692bc71 bound to our chassis
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.661 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d59a13e0-0444-4801-a38d-c9b9b692bc71
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.679 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d31584-1df0-4130-b1c8-e592cde5605d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.680 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd59a13e0-01 in ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.682 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd59a13e0-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.682 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[cc35b66c-ec56-4dac-81f5-17240378d91a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.684 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[d59b7df8-e518-4119-bc93-84c9bc7c62fd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 systemd-machined[153494]: New machine qemu-8-instance-0000000b.
Nov 25 19:17:27 compute-0 ovn_controller[95465]: 2025-11-25T19:17:27Z|00097|binding|INFO|Setting lport 590629bf-d033-4f3a-a3d8-5eb92ceecde0 ovn-installed in OVS
Nov 25 19:17:27 compute-0 ovn_controller[95465]: 2025-11-25T19:17:27Z|00098|binding|INFO|Setting lport 590629bf-d033-4f3a-a3d8-5eb92ceecde0 up in Southbound
Nov 25 19:17:27 compute-0 nova_compute[187212]: 2025-11-25 19:17:27.703 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.703 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[066b90a1-a723-4ada-9a87-a10c6c0cdbbd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Nov 25 19:17:27 compute-0 systemd-udevd[212965]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.722 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7d8493-be20-49c9-8525-7f16f6f4c951]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 NetworkManager[55552]: <info>  [1764098247.7442] device (tap590629bf-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:17:27 compute-0 NetworkManager[55552]: <info>  [1764098247.7452] device (tap590629bf-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.770 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[362c49c5-a8ad-4641-aab4-7124c39bb581]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.776 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ef86b75f-4fbe-4e5d-a0f9-09b9a8bd2130]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 NetworkManager[55552]: <info>  [1764098247.7777] manager: (tapd59a13e0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.823 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9f12cee9-9af3-4d03-a90c-b5d485e3df43]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.829 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[124a071b-889d-4199-b816-f80577005ea2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 NetworkManager[55552]: <info>  [1764098247.8650] device (tapd59a13e0-00): carrier: link connected
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.875 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[da216fbe-7fed-4ab7-a2af-1a4d918db059]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.904 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b8918e19-6b06-41f4-a3a2-f598c0e6a1ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd59a13e0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:69:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419553, 'reachable_time': 41528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212995, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.925 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4acc2859-a3e6-4481-a348-a951ad049983]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:69db'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419553, 'tstamp': 419553}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212996, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.948 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[59a89426-6716-4013-8394-c2f94dbf625b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd59a13e0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:69:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419553, 'reachable_time': 41528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212997, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:27.990 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[58bb0106-ec17-4bd5-baf2-9ac9fb9f04dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.075 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3654dc8c-482c-459f-8b55-29990e79966c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.076 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd59a13e0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.076 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.077 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd59a13e0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:28 compute-0 NetworkManager[55552]: <info>  [1764098248.0795] manager: (tapd59a13e0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.079 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:28 compute-0 kernel: tapd59a13e0-00: entered promiscuous mode
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.083 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.084 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd59a13e0-00, col_values=(('external_ids', {'iface-id': '0c24d2e0-449b-4c43-8064-1c741005deb8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:17:28 compute-0 ovn_controller[95465]: 2025-11-25T19:17:28Z|00099|binding|INFO|Releasing lport 0c24d2e0-449b-4c43-8064-1c741005deb8 from this chassis (sb_readonly=0)
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.085 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.109 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.111 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[152f89e4-571b-4445-bf37-394370b3081a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.112 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.112 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.112 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d59a13e0-0444-4801-a38d-c9b9b692bc71 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.112 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.113 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c74b88b8-508d-4891-9e19-814b6fdc9321]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.113 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.113 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2f37db1b-3811-4496-b667-5f63bfef5142]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.114 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-d59a13e0-0444-4801-a38d-c9b9b692bc71
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID d59a13e0-0444-4801-a38d-c9b9b692bc71
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:17:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:28.114 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'env', 'PROCESS_TAG=haproxy-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d59a13e0-0444-4801-a38d-c9b9b692bc71.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.436 187216 DEBUG nova.compute.manager [req-610f9aab-2bb2-4f3a-9054-c923690a0b56 req-7e1772f2-14c0-420c-8b2f-7aaa42f60e17 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-plugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.437 187216 DEBUG oslo_concurrency.lockutils [req-610f9aab-2bb2-4f3a-9054-c923690a0b56 req-7e1772f2-14c0-420c-8b2f-7aaa42f60e17 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.437 187216 DEBUG oslo_concurrency.lockutils [req-610f9aab-2bb2-4f3a-9054-c923690a0b56 req-7e1772f2-14c0-420c-8b2f-7aaa42f60e17 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.437 187216 DEBUG oslo_concurrency.lockutils [req-610f9aab-2bb2-4f3a-9054-c923690a0b56 req-7e1772f2-14c0-420c-8b2f-7aaa42f60e17 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.438 187216 DEBUG nova.compute.manager [req-610f9aab-2bb2-4f3a-9054-c923690a0b56 req-7e1772f2-14c0-420c-8b2f-7aaa42f60e17 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Processing event network-vif-plugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.439 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.445 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.450 187216 INFO nova.virt.libvirt.driver [-] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Instance spawned successfully.
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.451 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:17:28 compute-0 podman[213036]: 2025-11-25 19:17:28.676084459 +0000 UTC m=+0.081401464 container create a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 25 19:17:28 compute-0 podman[213036]: 2025-11-25 19:17:28.632819337 +0000 UTC m=+0.038136392 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:17:28 compute-0 systemd[1]: Started libpod-conmon-a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7.scope.
Nov 25 19:17:28 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:17:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8700569634fb5e954051e483acde542e5495357c6a4b6d321315509d2f9f6721/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:17:28 compute-0 podman[213036]: 2025-11-25 19:17:28.80356312 +0000 UTC m=+0.208880165 container init a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:17:28 compute-0 podman[213036]: 2025-11-25 19:17:28.813987445 +0000 UTC m=+0.219304450 container start a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:17:28 compute-0 podman[213049]: 2025-11-25 19:17:28.820878283 +0000 UTC m=+0.090073171 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:17:28 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [NOTICE]   (213078) : New worker (213080) forked
Nov 25 19:17:28 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [NOTICE]   (213078) : Loading success.
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.979 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.980 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.981 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.982 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.983 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:17:28 compute-0 nova_compute[187212]: 2025-11-25 19:17:28.984 187216 DEBUG nova.virt.libvirt.driver [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:17:29 compute-0 nova_compute[187212]: 2025-11-25 19:17:29.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:29 compute-0 nova_compute[187212]: 2025-11-25 19:17:29.500 187216 INFO nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Took 9.87 seconds to spawn the instance on the hypervisor.
Nov 25 19:17:29 compute-0 nova_compute[187212]: 2025-11-25 19:17:29.501 187216 DEBUG nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:17:29 compute-0 nova_compute[187212]: 2025-11-25 19:17:29.542 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:29 compute-0 nova_compute[187212]: 2025-11-25 19:17:29.691 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:17:29 compute-0 podman[197585]: time="2025-11-25T19:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:17:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:17:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.152 187216 INFO nova.compute.manager [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Took 15.23 seconds to build instance.
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.510 187216 DEBUG nova.compute.manager [req-9c16b00c-b144-4bc5-ad06-c8dbfe22e4f7 req-e5bd2939-ab84-4c26-b146-8ef33af0db55 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-plugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.511 187216 DEBUG oslo_concurrency.lockutils [req-9c16b00c-b144-4bc5-ad06-c8dbfe22e4f7 req-e5bd2939-ab84-4c26-b146-8ef33af0db55 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.511 187216 DEBUG oslo_concurrency.lockutils [req-9c16b00c-b144-4bc5-ad06-c8dbfe22e4f7 req-e5bd2939-ab84-4c26-b146-8ef33af0db55 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.511 187216 DEBUG oslo_concurrency.lockutils [req-9c16b00c-b144-4bc5-ad06-c8dbfe22e4f7 req-e5bd2939-ab84-4c26-b146-8ef33af0db55 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.511 187216 DEBUG nova.compute.manager [req-9c16b00c-b144-4bc5-ad06-c8dbfe22e4f7 req-e5bd2939-ab84-4c26-b146-8ef33af0db55 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] No waiting events found dispatching network-vif-plugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.512 187216 WARNING nova.compute.manager [req-9c16b00c-b144-4bc5-ad06-c8dbfe22e4f7 req-e5bd2939-ab84-4c26-b146-8ef33af0db55 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received unexpected event network-vif-plugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 for instance with vm_state active and task_state None.
Nov 25 19:17:30 compute-0 nova_compute[187212]: 2025-11-25 19:17:30.666 187216 DEBUG oslo_concurrency.lockutils [None req-692522b7-9f59-468f-9206-3fafea45e792 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.755s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:31.089 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:17:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:31.089 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:17:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:17:31.090 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: ERROR   19:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: ERROR   19:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: ERROR   19:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: ERROR   19:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: ERROR   19:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:17:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:17:32 compute-0 nova_compute[187212]: 2025-11-25 19:17:32.294 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:34 compute-0 nova_compute[187212]: 2025-11-25 19:17:34.544 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:35 compute-0 podman[213091]: 2025-11-25 19:17:35.255766816 +0000 UTC m=+0.178231418 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 19:17:37 compute-0 nova_compute[187212]: 2025-11-25 19:17:37.299 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:38 compute-0 podman[213118]: 2025-11-25 19:17:38.183302076 +0000 UTC m=+0.087070719 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 19:17:39 compute-0 nova_compute[187212]: 2025-11-25 19:17:39.548 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:40 compute-0 ovn_controller[95465]: 2025-11-25T19:17:40Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:a6:bc 10.100.0.4
Nov 25 19:17:40 compute-0 ovn_controller[95465]: 2025-11-25T19:17:40Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:a6:bc 10.100.0.4
Nov 25 19:17:42 compute-0 nova_compute[187212]: 2025-11-25 19:17:42.300 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:44 compute-0 podman[213149]: 2025-11-25 19:17:44.165138317 +0000 UTC m=+0.085784013 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:17:44 compute-0 nova_compute[187212]: 2025-11-25 19:17:44.551 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:47 compute-0 nova_compute[187212]: 2025-11-25 19:17:47.304 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:48 compute-0 podman[213171]: 2025-11-25 19:17:48.206903045 +0000 UTC m=+0.127891513 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:17:49 compute-0 nova_compute[187212]: 2025-11-25 19:17:49.554 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:52 compute-0 nova_compute[187212]: 2025-11-25 19:17:52.308 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:54 compute-0 nova_compute[187212]: 2025-11-25 19:17:54.596 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:57 compute-0 nova_compute[187212]: 2025-11-25 19:17:57.310 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:58 compute-0 ovn_controller[95465]: 2025-11-25T19:17:58Z|00100|memory_trim|INFO|Detected inactivity (last active 30025 ms ago): trimming memory
Nov 25 19:17:59 compute-0 podman[213191]: 2025-11-25 19:17:59.166633161 +0000 UTC m=+0.084374955 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:17:59 compute-0 nova_compute[187212]: 2025-11-25 19:17:59.636 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:17:59 compute-0 podman[197585]: time="2025-11-25T19:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:17:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:17:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3087 "" "Go-http-client/1.1"
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: ERROR   19:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: ERROR   19:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: ERROR   19:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: ERROR   19:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: ERROR   19:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:18:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:18:02 compute-0 nova_compute[187212]: 2025-11-25 19:18:02.359 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:03 compute-0 nova_compute[187212]: 2025-11-25 19:18:03.833 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Creating tmpfile /var/lib/nova/instances/tmpek0nm0zs to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Nov 25 19:18:03 compute-0 nova_compute[187212]: 2025-11-25 19:18:03.834 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:03 compute-0 nova_compute[187212]: 2025-11-25 19:18:03.850 187216 DEBUG nova.compute.manager [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpek0nm0zs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Nov 25 19:18:04 compute-0 nova_compute[187212]: 2025-11-25 19:18:04.689 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:05 compute-0 nova_compute[187212]: 2025-11-25 19:18:05.899 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:06 compute-0 podman[213217]: 2025-11-25 19:18:06.219219894 +0000 UTC m=+0.128138680 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:18:07 compute-0 nova_compute[187212]: 2025-11-25 19:18:07.361 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:09 compute-0 podman[213244]: 2025-11-25 19:18:09.159628606 +0000 UTC m=+0.075916524 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 25 19:18:09 compute-0 nova_compute[187212]: 2025-11-25 19:18:09.692 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:10 compute-0 nova_compute[187212]: 2025-11-25 19:18:10.319 187216 DEBUG nova.compute.manager [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpek0nm0zs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='32736a79-cfc3-48ac-a561-ab16cd576f9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Nov 25 19:18:11 compute-0 nova_compute[187212]: 2025-11-25 19:18:11.337 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-32736a79-cfc3-48ac-a561-ab16cd576f9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:18:11 compute-0 nova_compute[187212]: 2025-11-25 19:18:11.338 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-32736a79-cfc3-48ac-a561-ab16cd576f9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:18:11 compute-0 nova_compute[187212]: 2025-11-25 19:18:11.338 187216 DEBUG nova.network.neutron [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:18:11 compute-0 nova_compute[187212]: 2025-11-25 19:18:11.849 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:12 compute-0 nova_compute[187212]: 2025-11-25 19:18:12.363 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:12 compute-0 nova_compute[187212]: 2025-11-25 19:18:12.752 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:12 compute-0 nova_compute[187212]: 2025-11-25 19:18:12.922 187216 DEBUG nova.network.neutron [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Updating instance_info_cache with network_info: [{"id": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "address": "fa:16:3e:a5:4c:c5", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf479a66-03", "ovs_interfaceid": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.430 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-32736a79-cfc3-48ac-a561-ab16cd576f9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.446 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpek0nm0zs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='32736a79-cfc3-48ac-a561-ab16cd576f9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.447 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Creating instance directory: /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.447 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Creating disk.info with the contents: {'/var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk': 'qcow2', '/var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.448 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.449 187216 DEBUG nova.objects.instance [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 32736a79-cfc3-48ac-a561-ab16cd576f9c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.955 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.960 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:18:13 compute-0 nova_compute[187212]: 2025-11-25 19:18:13.962 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.064 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.065 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.066 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.067 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.073 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.074 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.163 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.164 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.209 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.211 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.212 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.300 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.301 187216 DEBUG nova.virt.disk.api [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Checking if we can resize image /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.301 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.371 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.372 187216 DEBUG nova.virt.disk.api [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Cannot resize image /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.373 187216 DEBUG nova.objects.instance [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'migration_context' on Instance uuid 32736a79-cfc3-48ac-a561-ab16cd576f9c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.695 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.882 187216 DEBUG nova.objects.base [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<32736a79-cfc3-48ac-a561-ab16cd576f9c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.883 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.925 187216 DEBUG oslo_concurrency.processutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk.config 497664" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.927 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.929 187216 DEBUG nova.virt.libvirt.vif [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-11-25T19:16:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-740450704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-740450704',id=10,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:17:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='492e2572d9d24bf29c79f2b1c9dab462',ramdisk_id='',reservation_id='r-7v9fdigj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-625591926',owner_user_name='tempest-TestExecuteBasicStrategy-625591926-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:17:07Z,user_data=None,user_id='81221121fff040f195867ab25b59f26e',uuid=32736a79-cfc3-48ac-a561-ab16cd576f9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "address": "fa:16:3e:a5:4c:c5", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf479a66-03", "ovs_interfaceid": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.930 187216 DEBUG nova.network.os_vif_util [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "address": "fa:16:3e:a5:4c:c5", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf479a66-03", "ovs_interfaceid": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.932 187216 DEBUG nova.network.os_vif_util [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4c:c5,bridge_name='br-int',has_traffic_filtering=True,id=bf479a66-0387-4d8d-81f9-8ad5c43f7122,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf479a66-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.932 187216 DEBUG os_vif [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4c:c5,bridge_name='br-int',has_traffic_filtering=True,id=bf479a66-0387-4d8d-81f9-8ad5c43f7122,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf479a66-03') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.934 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.935 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.936 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.937 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.938 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8b977443-45ee-5d77-9a04-b158bb1cfd4c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.940 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.943 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.948 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.949 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf479a66-03, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.950 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbf479a66-03, col_values=(('qos', UUID('c7073440-a0ff-47bc-a92d-538618e4cf91')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.951 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbf479a66-03, col_values=(('external_ids', {'iface-id': 'bf479a66-0387-4d8d-81f9-8ad5c43f7122', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:4c:c5', 'vm-uuid': '32736a79-cfc3-48ac-a561-ab16cd576f9c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.953 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 NetworkManager[55552]: <info>  [1764098294.9556] manager: (tapbf479a66-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.957 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.964 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.965 187216 INFO os_vif [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:4c:c5,bridge_name='br-int',has_traffic_filtering=True,id=bf479a66-0387-4d8d-81f9-8ad5c43f7122,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf479a66-03')
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.966 187216 DEBUG nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.966 187216 DEBUG nova.compute.manager [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpek0nm0zs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='32736a79-cfc3-48ac-a561-ab16cd576f9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Nov 25 19:18:14 compute-0 nova_compute[187212]: 2025-11-25 19:18:14.967 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:15 compute-0 podman[213285]: 2025-11-25 19:18:15.151982154 +0000 UTC m=+0.080261473 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 19:18:15 compute-0 nova_compute[187212]: 2025-11-25 19:18:15.301 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:15.674 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:18:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:15.675 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:18:15 compute-0 nova_compute[187212]: 2025-11-25 19:18:15.677 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:16 compute-0 nova_compute[187212]: 2025-11-25 19:18:16.108 187216 DEBUG nova.network.neutron [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Port bf479a66-0387-4d8d-81f9-8ad5c43f7122 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Nov 25 19:18:16 compute-0 nova_compute[187212]: 2025-11-25 19:18:16.123 187216 DEBUG nova.compute.manager [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpek0nm0zs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='32736a79-cfc3-48ac-a561-ab16cd576f9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Nov 25 19:18:16 compute-0 nova_compute[187212]: 2025-11-25 19:18:16.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:16 compute-0 nova_compute[187212]: 2025-11-25 19:18:16.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:18:17 compute-0 nova_compute[187212]: 2025-11-25 19:18:17.365 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:19 compute-0 podman[213308]: 2025-11-25 19:18:19.165515022 +0000 UTC m=+0.082816222 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:18:19 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 25 19:18:19 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 25 19:18:19 compute-0 kernel: tapbf479a66-03: entered promiscuous mode
Nov 25 19:18:19 compute-0 NetworkManager[55552]: <info>  [1764098299.6411] manager: (tapbf479a66-03): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Nov 25 19:18:19 compute-0 nova_compute[187212]: 2025-11-25 19:18:19.642 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:19 compute-0 nova_compute[187212]: 2025-11-25 19:18:19.649 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:19 compute-0 ovn_controller[95465]: 2025-11-25T19:18:19Z|00101|binding|INFO|Claiming lport bf479a66-0387-4d8d-81f9-8ad5c43f7122 for this additional chassis.
Nov 25 19:18:19 compute-0 ovn_controller[95465]: 2025-11-25T19:18:19Z|00102|binding|INFO|bf479a66-0387-4d8d-81f9-8ad5c43f7122: Claiming fa:16:3e:a5:4c:c5 10.100.0.9
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.660 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:4c:c5 10.100.0.9'], port_security=['fa:16:3e:a5:4c:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '32736a79-cfc3-48ac-a561-ab16cd576f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '492e2572d9d24bf29c79f2b1c9dab462', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'e9615ad8-5145-4bd7-be4d-76e46400d6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=874f2765-a6cf-42e0-8c42-359508942dc4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=bf479a66-0387-4d8d-81f9-8ad5c43f7122) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.661 104356 INFO neutron.agent.ovn.metadata.agent [-] Port bf479a66-0387-4d8d-81f9-8ad5c43f7122 in datapath d59a13e0-0444-4801-a38d-c9b9b692bc71 unbound from our chassis
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.663 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d59a13e0-0444-4801-a38d-c9b9b692bc71
Nov 25 19:18:19 compute-0 ovn_controller[95465]: 2025-11-25T19:18:19Z|00103|binding|INFO|Setting lport bf479a66-0387-4d8d-81f9-8ad5c43f7122 ovn-installed in OVS
Nov 25 19:18:19 compute-0 nova_compute[187212]: 2025-11-25 19:18:19.679 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.690 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[98e1e14c-14ef-46d4-b0ed-68a7c27e4ad4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 systemd-udevd[213365]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:18:19 compute-0 systemd-machined[153494]: New machine qemu-9-instance-0000000a.
Nov 25 19:18:19 compute-0 NetworkManager[55552]: <info>  [1764098299.7133] device (tapbf479a66-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:18:19 compute-0 NetworkManager[55552]: <info>  [1764098299.7157] device (tapbf479a66-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:18:19 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000a.
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.737 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9565bf5a-13aa-43fe-b2a3-c09752db287b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.741 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3af4bf-307a-4dc5-a61b-7cfcc49ea076]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.787 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[fac43d52-0332-4c11-a288-159b09d1eeab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.815 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae1640b-5024-469e-832a-776893b817e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd59a13e0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:69:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419553, 'reachable_time': 41528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213377, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.839 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4c23d021-67dd-46d1-b7d2-932be5a8b096]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd59a13e0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419569, 'tstamp': 419569}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213378, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd59a13e0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419573, 'tstamp': 419573}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213378, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.843 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd59a13e0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:19 compute-0 nova_compute[187212]: 2025-11-25 19:18:19.845 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:19 compute-0 nova_compute[187212]: 2025-11-25 19:18:19.846 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.847 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd59a13e0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.847 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.848 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd59a13e0-00, col_values=(('external_ids', {'iface-id': '0c24d2e0-449b-4c43-8064-1c741005deb8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.848 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:18:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:19.851 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c61148-3cf8-41d3-9977-2f5ac2c7ca0e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d59a13e0-0444-4801-a38d-c9b9b692bc71\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d59a13e0-0444-4801-a38d-c9b9b692bc71\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:19 compute-0 nova_compute[187212]: 2025-11-25 19:18:19.953 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:21 compute-0 nova_compute[187212]: 2025-11-25 19:18:21.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:22 compute-0 nova_compute[187212]: 2025-11-25 19:18:22.409 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:23 compute-0 nova_compute[187212]: 2025-11-25 19:18:23.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:23 compute-0 nova_compute[187212]: 2025-11-25 19:18:23.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:23 compute-0 ovn_controller[95465]: 2025-11-25T19:18:23Z|00104|binding|INFO|Claiming lport bf479a66-0387-4d8d-81f9-8ad5c43f7122 for this chassis.
Nov 25 19:18:23 compute-0 ovn_controller[95465]: 2025-11-25T19:18:23Z|00105|binding|INFO|bf479a66-0387-4d8d-81f9-8ad5c43f7122: Claiming fa:16:3e:a5:4c:c5 10.100.0.9
Nov 25 19:18:23 compute-0 ovn_controller[95465]: 2025-11-25T19:18:23Z|00106|binding|INFO|Setting lport bf479a66-0387-4d8d-81f9-8ad5c43f7122 up in Southbound
Nov 25 19:18:23 compute-0 nova_compute[187212]: 2025-11-25 19:18:23.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:23 compute-0 nova_compute[187212]: 2025-11-25 19:18:23.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:23 compute-0 nova_compute[187212]: 2025-11-25 19:18:23.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:23 compute-0 nova_compute[187212]: 2025-11-25 19:18:23.689 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.400 187216 INFO nova.compute.manager [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Post operation of migration started
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.401 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.557 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.558 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.674 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-32736a79-cfc3-48ac-a561-ab16cd576f9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.675 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-32736a79-cfc3-48ac-a561-ab16cd576f9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.676 187216 DEBUG nova.network.neutron [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.743 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.832 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.901 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.909 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.956 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.973 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:24 compute-0 nova_compute[187212]: 2025-11-25 19:18:24.974 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.057 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.184 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.315 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.317 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.349 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.350 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5451MB free_disk=72.93885040283203GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.351 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:25 compute-0 nova_compute[187212]: 2025-11-25 19:18:25.351 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:25 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:25.676 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:26 compute-0 nova_compute[187212]: 2025-11-25 19:18:26.273 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:26 compute-0 nova_compute[187212]: 2025-11-25 19:18:26.377 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Migration for instance 32736a79-cfc3-48ac-a561-ab16cd576f9c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Nov 25 19:18:26 compute-0 nova_compute[187212]: 2025-11-25 19:18:26.498 187216 DEBUG nova.network.neutron [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Updating instance_info_cache with network_info: [{"id": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "address": "fa:16:3e:a5:4c:c5", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf479a66-03", "ovs_interfaceid": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:18:26 compute-0 nova_compute[187212]: 2025-11-25 19:18:26.886 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Updating resource usage from migration 5a3651bb-167c-4bd8-bf1c-e660cc38974b
Nov 25 19:18:26 compute-0 nova_compute[187212]: 2025-11-25 19:18:26.886 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Starting to track incoming migration 5a3651bb-167c-4bd8-bf1c-e660cc38974b with flavor d7d5bae9-10ca-4750-9d69-ce73a869da56 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.006 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-32736a79-cfc3-48ac-a561-ab16cd576f9c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.412 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.432 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 868f271d-a554-4ac7-8fc5-de42da59e9f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.536 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.941 187216 WARNING nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 32736a79-cfc3-48ac-a561-ab16cd576f9c has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.942 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:18:27 compute-0 nova_compute[187212]: 2025-11-25 19:18:27.942 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:18:25 up  1:10,  0 user,  load average: 0.45, 0.53, 0.51\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_492e2572d9d24bf29c79f2b1c9dab462': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:18:28 compute-0 nova_compute[187212]: 2025-11-25 19:18:28.020 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:18:28 compute-0 nova_compute[187212]: 2025-11-25 19:18:28.530 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:18:29 compute-0 nova_compute[187212]: 2025-11-25 19:18:29.047 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:18:29 compute-0 nova_compute[187212]: 2025-11-25 19:18:29.048 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.697s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:29 compute-0 nova_compute[187212]: 2025-11-25 19:18:29.049 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 1.513s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:29 compute-0 nova_compute[187212]: 2025-11-25 19:18:29.049 187216 DEBUG oslo_concurrency.lockutils [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:29 compute-0 nova_compute[187212]: 2025-11-25 19:18:29.055 187216 INFO nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 25 19:18:29 compute-0 virtqemud[186888]: Domain id=9 name='instance-0000000a' uuid=32736a79-cfc3-48ac-a561-ab16cd576f9c is tainted: custom-monitor
Nov 25 19:18:29 compute-0 podman[197585]: time="2025-11-25T19:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:18:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Nov 25 19:18:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3090 "" "Go-http-client/1.1"
Nov 25 19:18:29 compute-0 nova_compute[187212]: 2025-11-25 19:18:29.959 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:30 compute-0 nova_compute[187212]: 2025-11-25 19:18:30.045 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:30 compute-0 nova_compute[187212]: 2025-11-25 19:18:30.047 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:30 compute-0 nova_compute[187212]: 2025-11-25 19:18:30.064 187216 INFO nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 25 19:18:30 compute-0 nova_compute[187212]: 2025-11-25 19:18:30.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:30 compute-0 nova_compute[187212]: 2025-11-25 19:18:30.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:18:30 compute-0 podman[213414]: 2025-11-25 19:18:30.188003691 +0000 UTC m=+0.101446311 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:18:31 compute-0 nova_compute[187212]: 2025-11-25 19:18:31.073 187216 INFO nova.virt.libvirt.driver [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 25 19:18:31 compute-0 nova_compute[187212]: 2025-11-25 19:18:31.080 187216 DEBUG nova.compute.manager [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:18:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:31.090 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:31.091 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:31.091 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: ERROR   19:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: ERROR   19:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: ERROR   19:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: ERROR   19:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: ERROR   19:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:18:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:18:31 compute-0 nova_compute[187212]: 2025-11-25 19:18:31.592 187216 DEBUG nova.objects.instance [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Nov 25 19:18:32 compute-0 nova_compute[187212]: 2025-11-25 19:18:32.449 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:32 compute-0 nova_compute[187212]: 2025-11-25 19:18:32.615 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:33 compute-0 nova_compute[187212]: 2025-11-25 19:18:33.295 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:33 compute-0 nova_compute[187212]: 2025-11-25 19:18:33.296 187216 WARNING neutronclient.v2_0.client [None req-04ee0304-5cac-4366-afb4-14375b83832d 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:34 compute-0 nova_compute[187212]: 2025-11-25 19:18:34.962 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:37 compute-0 podman[213440]: 2025-11-25 19:18:37.277061441 +0000 UTC m=+0.190118393 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 19:18:37 compute-0 nova_compute[187212]: 2025-11-25 19:18:37.451 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.270 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.271 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.271 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.271 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.272 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.289 187216 INFO nova.compute.manager [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Terminating instance
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.810 187216 DEBUG nova.compute.manager [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:18:38 compute-0 kernel: tap590629bf-d0 (unregistering): left promiscuous mode
Nov 25 19:18:38 compute-0 NetworkManager[55552]: <info>  [1764098318.8559] device (tap590629bf-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.863 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:38 compute-0 ovn_controller[95465]: 2025-11-25T19:18:38Z|00107|binding|INFO|Releasing lport 590629bf-d033-4f3a-a3d8-5eb92ceecde0 from this chassis (sb_readonly=0)
Nov 25 19:18:38 compute-0 ovn_controller[95465]: 2025-11-25T19:18:38Z|00108|binding|INFO|Setting lport 590629bf-d033-4f3a-a3d8-5eb92ceecde0 down in Southbound
Nov 25 19:18:38 compute-0 ovn_controller[95465]: 2025-11-25T19:18:38Z|00109|binding|INFO|Removing iface tap590629bf-d0 ovn-installed in OVS
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.867 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:38.874 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:a6:bc 10.100.0.4'], port_security=['fa:16:3e:f2:a6:bc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '868f271d-a554-4ac7-8fc5-de42da59e9f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '492e2572d9d24bf29c79f2b1c9dab462', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e9615ad8-5145-4bd7-be4d-76e46400d6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=874f2765-a6cf-42e0-8c42-359508942dc4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=590629bf-d033-4f3a-a3d8-5eb92ceecde0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:18:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:38.876 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 590629bf-d033-4f3a-a3d8-5eb92ceecde0 in datapath d59a13e0-0444-4801-a38d-c9b9b692bc71 unbound from our chassis
Nov 25 19:18:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:38.879 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d59a13e0-0444-4801-a38d-c9b9b692bc71
Nov 25 19:18:38 compute-0 nova_compute[187212]: 2025-11-25 19:18:38.895 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:38.909 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[91a6e0a1-9455-42ad-8b40-2904187f137a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:38 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 25 19:18:38 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 16.359s CPU time.
Nov 25 19:18:38 compute-0 systemd-machined[153494]: Machine qemu-8-instance-0000000b terminated.
Nov 25 19:18:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:38.964 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[c8106ac3-c12f-42aa-b8af-8c020f6d8041]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:38 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:38.967 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[7d002809-636d-42a2-9109-c4546d3df807]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.017 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c66d1c-334f-4586-bfb6-725981acced5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.046 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[eed323f1-f321-4a2c-ad89-cf93c6973030]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd59a13e0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3c:69:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419553, 'reachable_time': 41528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213482, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.073 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[18686036-9fcd-428d-90c7-d428fc2ed0b7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd59a13e0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419569, 'tstamp': 419569}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213489, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd59a13e0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419573, 'tstamp': 419573}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213489, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.075 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd59a13e0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.078 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.085 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.086 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd59a13e0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.086 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.087 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd59a13e0-00, col_values=(('external_ids', {'iface-id': '0c24d2e0-449b-4c43-8064-1c741005deb8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.087 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:18:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:39.089 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[13fba083-41b9-49f9-87c5-32caa9101a77]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d59a13e0-0444-4801-a38d-c9b9b692bc71\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d59a13e0-0444-4801-a38d-c9b9b692bc71\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.109 187216 DEBUG nova.compute.manager [req-d05f2087-ad03-4a32-9757-b029d8c36b9f req-1a8f5cc6-7517-4110-939f-42d34ef3e066 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-unplugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.110 187216 DEBUG oslo_concurrency.lockutils [req-d05f2087-ad03-4a32-9757-b029d8c36b9f req-1a8f5cc6-7517-4110-939f-42d34ef3e066 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.110 187216 DEBUG oslo_concurrency.lockutils [req-d05f2087-ad03-4a32-9757-b029d8c36b9f req-1a8f5cc6-7517-4110-939f-42d34ef3e066 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.111 187216 DEBUG oslo_concurrency.lockutils [req-d05f2087-ad03-4a32-9757-b029d8c36b9f req-1a8f5cc6-7517-4110-939f-42d34ef3e066 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.111 187216 DEBUG nova.compute.manager [req-d05f2087-ad03-4a32-9757-b029d8c36b9f req-1a8f5cc6-7517-4110-939f-42d34ef3e066 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] No waiting events found dispatching network-vif-unplugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.111 187216 DEBUG nova.compute.manager [req-d05f2087-ad03-4a32-9757-b029d8c36b9f req-1a8f5cc6-7517-4110-939f-42d34ef3e066 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-unplugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.114 187216 INFO nova.virt.libvirt.driver [-] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Instance destroyed successfully.
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.114 187216 DEBUG nova.objects.instance [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lazy-loading 'resources' on Instance uuid 868f271d-a554-4ac7-8fc5-de42da59e9f4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.624 187216 DEBUG nova.virt.libvirt.vif [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1434666508',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1434666508',id=11,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:17:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='492e2572d9d24bf29c79f2b1c9dab462',ramdisk_id='',reservation_id='r-06y2bo17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-625591926',owner_user_name='tempest-TestExecuteBasicStrategy-625591926-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:17:29Z,user_data=None,user_id='81221121fff040f195867ab25b59f26e',uuid=868f271d-a554-4ac7-8fc5-de42da59e9f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.625 187216 DEBUG nova.network.os_vif_util [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converting VIF {"id": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "address": "fa:16:3e:f2:a6:bc", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap590629bf-d0", "ovs_interfaceid": "590629bf-d033-4f3a-a3d8-5eb92ceecde0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.626 187216 DEBUG nova.network.os_vif_util [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.626 187216 DEBUG os_vif [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.629 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.629 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap590629bf-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.676 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.677 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.677 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=91be79bb-02fa-4afb-9d91-eb40c8281495) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.678 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.679 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.680 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.683 187216 INFO os_vif [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:a6:bc,bridge_name='br-int',has_traffic_filtering=True,id=590629bf-d033-4f3a-a3d8-5eb92ceecde0,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap590629bf-d0')
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.683 187216 INFO nova.virt.libvirt.driver [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Deleting instance files /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4_del
Nov 25 19:18:39 compute-0 nova_compute[187212]: 2025-11-25 19:18:39.684 187216 INFO nova.virt.libvirt.driver [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Deletion of /var/lib/nova/instances/868f271d-a554-4ac7-8fc5-de42da59e9f4_del complete
Nov 25 19:18:40 compute-0 podman[213501]: 2025-11-25 19:18:40.166820269 +0000 UTC m=+0.085547338 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:18:40 compute-0 nova_compute[187212]: 2025-11-25 19:18:40.199 187216 INFO nova.compute.manager [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Took 1.39 seconds to destroy the instance on the hypervisor.
Nov 25 19:18:40 compute-0 nova_compute[187212]: 2025-11-25 19:18:40.200 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:18:40 compute-0 nova_compute[187212]: 2025-11-25 19:18:40.200 187216 DEBUG nova.compute.manager [-] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:18:40 compute-0 nova_compute[187212]: 2025-11-25 19:18:40.200 187216 DEBUG nova.network.neutron [-] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:18:40 compute-0 nova_compute[187212]: 2025-11-25 19:18:40.201 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:40 compute-0 nova_compute[187212]: 2025-11-25 19:18:40.324 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.203 187216 DEBUG nova.network.neutron [-] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.222 187216 DEBUG nova.compute.manager [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-unplugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.223 187216 DEBUG oslo_concurrency.lockutils [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.223 187216 DEBUG oslo_concurrency.lockutils [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.223 187216 DEBUG oslo_concurrency.lockutils [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.223 187216 DEBUG nova.compute.manager [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] No waiting events found dispatching network-vif-unplugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.224 187216 DEBUG nova.compute.manager [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-unplugged-590629bf-d033-4f3a-a3d8-5eb92ceecde0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.224 187216 DEBUG nova.compute.manager [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Received event network-vif-deleted-590629bf-d033-4f3a-a3d8-5eb92ceecde0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.224 187216 INFO nova.compute.manager [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Neutron deleted interface 590629bf-d033-4f3a-a3d8-5eb92ceecde0; detaching it from the instance and deleting it from the info cache
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.224 187216 DEBUG nova.network.neutron [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.710 187216 INFO nova.compute.manager [-] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Took 1.51 seconds to deallocate network for instance.
Nov 25 19:18:41 compute-0 nova_compute[187212]: 2025-11-25 19:18:41.735 187216 DEBUG nova.compute.manager [req-97f08204-24f1-49b5-a613-fa9c2b90e951 req-275ce6c5-d3e0-4e8a-a230-6f8c2b36f85b 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 868f271d-a554-4ac7-8fc5-de42da59e9f4] Detach interface failed, port_id=590629bf-d033-4f3a-a3d8-5eb92ceecde0, reason: Instance 868f271d-a554-4ac7-8fc5-de42da59e9f4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:18:42 compute-0 nova_compute[187212]: 2025-11-25 19:18:42.242 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:42 compute-0 nova_compute[187212]: 2025-11-25 19:18:42.242 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:42 compute-0 nova_compute[187212]: 2025-11-25 19:18:42.319 187216 DEBUG nova.compute.provider_tree [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:18:42 compute-0 nova_compute[187212]: 2025-11-25 19:18:42.453 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:42 compute-0 nova_compute[187212]: 2025-11-25 19:18:42.827 187216 DEBUG nova.scheduler.client.report [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:18:43 compute-0 nova_compute[187212]: 2025-11-25 19:18:43.338 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:43 compute-0 nova_compute[187212]: 2025-11-25 19:18:43.372 187216 INFO nova.scheduler.client.report [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Deleted allocations for instance 868f271d-a554-4ac7-8fc5-de42da59e9f4
Nov 25 19:18:44 compute-0 nova_compute[187212]: 2025-11-25 19:18:44.406 187216 DEBUG oslo_concurrency.lockutils [None req-01e7612a-72c3-422c-a52e-d72b135417f3 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "868f271d-a554-4ac7-8fc5-de42da59e9f4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:44 compute-0 nova_compute[187212]: 2025-11-25 19:18:44.722 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.119 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "32736a79-cfc3-48ac-a561-ab16cd576f9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.120 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.120 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.120 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.121 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.135 187216 INFO nova.compute.manager [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Terminating instance
Nov 25 19:18:46 compute-0 podman[213520]: 2025-11-25 19:18:46.171582145 +0000 UTC m=+0.094025859 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64)
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.658 187216 DEBUG nova.compute.manager [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:18:46 compute-0 kernel: tapbf479a66-03 (unregistering): left promiscuous mode
Nov 25 19:18:46 compute-0 NetworkManager[55552]: <info>  [1764098326.6845] device (tapbf479a66-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:18:46 compute-0 ovn_controller[95465]: 2025-11-25T19:18:46Z|00110|binding|INFO|Releasing lport bf479a66-0387-4d8d-81f9-8ad5c43f7122 from this chassis (sb_readonly=0)
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.693 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:46 compute-0 ovn_controller[95465]: 2025-11-25T19:18:46Z|00111|binding|INFO|Setting lport bf479a66-0387-4d8d-81f9-8ad5c43f7122 down in Southbound
Nov 25 19:18:46 compute-0 ovn_controller[95465]: 2025-11-25T19:18:46Z|00112|binding|INFO|Removing iface tapbf479a66-03 ovn-installed in OVS
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.697 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:46.703 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:4c:c5 10.100.0.9'], port_security=['fa:16:3e:a5:4c:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '32736a79-cfc3-48ac-a561-ab16cd576f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '492e2572d9d24bf29c79f2b1c9dab462', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'e9615ad8-5145-4bd7-be4d-76e46400d6dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=874f2765-a6cf-42e0-8c42-359508942dc4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=bf479a66-0387-4d8d-81f9-8ad5c43f7122) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:18:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:46.704 104356 INFO neutron.agent.ovn.metadata.agent [-] Port bf479a66-0387-4d8d-81f9-8ad5c43f7122 in datapath d59a13e0-0444-4801-a38d-c9b9b692bc71 unbound from our chassis
Nov 25 19:18:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:46.707 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d59a13e0-0444-4801-a38d-c9b9b692bc71, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:18:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:46.708 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1c3847-f09d-4bcc-a06c-78d8b7030364]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:46.708 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71 namespace which is not needed anymore
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.725 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:46 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 25 19:18:46 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000a.scope: Consumed 2.619s CPU time.
Nov 25 19:18:46 compute-0 systemd-machined[153494]: Machine qemu-9-instance-0000000a terminated.
Nov 25 19:18:46 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [NOTICE]   (213078) : haproxy version is 3.0.5-8e879a5
Nov 25 19:18:46 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [NOTICE]   (213078) : path to executable is /usr/sbin/haproxy
Nov 25 19:18:46 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [WARNING]  (213078) : Exiting Master process...
Nov 25 19:18:46 compute-0 podman[213566]: 2025-11-25 19:18:46.897244613 +0000 UTC m=+0.044666141 container kill a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:18:46 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [ALERT]    (213078) : Current worker (213080) exited with code 143 (Terminated)
Nov 25 19:18:46 compute-0 neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71[213057]: [WARNING]  (213078) : All workers exited. Exiting... (0)
Nov 25 19:18:46 compute-0 systemd[1]: libpod-a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7.scope: Deactivated successfully.
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.931 187216 DEBUG nova.compute.manager [req-e4fff009-b8c8-4ead-8bec-648ec62cdff9 req-0548bfa4-7811-4c55-ba05-7815efbc9b1d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Received event network-vif-unplugged-bf479a66-0387-4d8d-81f9-8ad5c43f7122 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.931 187216 DEBUG oslo_concurrency.lockutils [req-e4fff009-b8c8-4ead-8bec-648ec62cdff9 req-0548bfa4-7811-4c55-ba05-7815efbc9b1d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.932 187216 DEBUG oslo_concurrency.lockutils [req-e4fff009-b8c8-4ead-8bec-648ec62cdff9 req-0548bfa4-7811-4c55-ba05-7815efbc9b1d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.932 187216 DEBUG oslo_concurrency.lockutils [req-e4fff009-b8c8-4ead-8bec-648ec62cdff9 req-0548bfa4-7811-4c55-ba05-7815efbc9b1d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.932 187216 DEBUG nova.compute.manager [req-e4fff009-b8c8-4ead-8bec-648ec62cdff9 req-0548bfa4-7811-4c55-ba05-7815efbc9b1d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] No waiting events found dispatching network-vif-unplugged-bf479a66-0387-4d8d-81f9-8ad5c43f7122 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.933 187216 DEBUG nova.compute.manager [req-e4fff009-b8c8-4ead-8bec-648ec62cdff9 req-0548bfa4-7811-4c55-ba05-7815efbc9b1d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Received event network-vif-unplugged-bf479a66-0387-4d8d-81f9-8ad5c43f7122 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.940 187216 INFO nova.virt.libvirt.driver [-] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Instance destroyed successfully.
Nov 25 19:18:46 compute-0 nova_compute[187212]: 2025-11-25 19:18:46.940 187216 DEBUG nova.objects.instance [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lazy-loading 'resources' on Instance uuid 32736a79-cfc3-48ac-a561-ab16cd576f9c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:18:46 compute-0 podman[213592]: 2025-11-25 19:18:46.963660417 +0000 UTC m=+0.036503398 container died a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Nov 25 19:18:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7-userdata-shm.mount: Deactivated successfully.
Nov 25 19:18:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-8700569634fb5e954051e483acde542e5495357c6a4b6d321315509d2f9f6721-merged.mount: Deactivated successfully.
Nov 25 19:18:46 compute-0 podman[213592]: 2025-11-25 19:18:46.997799469 +0000 UTC m=+0.070642480 container cleanup a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Nov 25 19:18:47 compute-0 systemd[1]: libpod-conmon-a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7.scope: Deactivated successfully.
Nov 25 19:18:47 compute-0 podman[213597]: 2025-11-25 19:18:47.013763525 +0000 UTC m=+0.066661242 container remove a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.019 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba2d923-3086-4170-8942-9fcea4914521]: (4, ("Tue Nov 25 07:18:46 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71 (a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7)\na0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7\nTue Nov 25 07:18:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71 (a0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7)\na0c9ed31747eefe6abd4163be784936abd5b3b887fcae4664eb80daa7eb12ed7\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.021 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[38eb47f2-893c-476a-8474-d65eacaeb4c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.021 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d59a13e0-0444-4801-a38d-c9b9b692bc71.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.022 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[39ad82c3-f796-4dd6-affd-a9f9613bb0ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.023 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd59a13e0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 kernel: tapd59a13e0-00: left promiscuous mode
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.072 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.075 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[7ddd7f7f-a584-47b5-9721-ff3941a30ad2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.091 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[d18e64a0-a452-491c-a36a-1f24a222dc6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.092 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f72589b1-902e-4515-b1c4-ca45a4157fed]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.116 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[dc269812-8750-4050-905f-946526e8e7d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419542, 'reachable_time': 34089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213632, 'error': None, 'target': 'ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.124 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d59a13e0-0444-4801-a38d-c9b9b692bc71 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:18:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:18:47.125 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3a4fc1-5ac6-49cd-a6b0-1b4fcd67a0e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:18:47 compute-0 systemd[1]: run-netns-ovnmeta\x2dd59a13e0\x2d0444\x2d4801\x2da38d\x2dc9b9b692bc71.mount: Deactivated successfully.
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.448 187216 DEBUG nova.virt.libvirt.vif [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-11-25T19:16:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-740450704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-740450704',id=10,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:17:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='492e2572d9d24bf29c79f2b1c9dab462',ramdisk_id='',reservation_id='r-7v9fdigj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,reader,member',clean_attempts='1',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-625591926',owner_user_name='tempest-TestExecuteBasicStrategy-625591926-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:18:32Z,user_data=None,user_id='81221121fff040f195867ab25b59f26e',uuid=32736a79-cfc3-48ac-a561-ab16cd576f9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "address": "fa:16:3e:a5:4c:c5", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf479a66-03", "ovs_interfaceid": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.448 187216 DEBUG nova.network.os_vif_util [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converting VIF {"id": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "address": "fa:16:3e:a5:4c:c5", "network": {"id": "d59a13e0-0444-4801-a38d-c9b9b692bc71", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1410671736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8adb6455c6724d8fa5a19dd6d4d677ea", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf479a66-03", "ovs_interfaceid": "bf479a66-0387-4d8d-81f9-8ad5c43f7122", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.449 187216 DEBUG nova.network.os_vif_util [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:4c:c5,bridge_name='br-int',has_traffic_filtering=True,id=bf479a66-0387-4d8d-81f9-8ad5c43f7122,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf479a66-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.450 187216 DEBUG os_vif [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:4c:c5,bridge_name='br-int',has_traffic_filtering=True,id=bf479a66-0387-4d8d-81f9-8ad5c43f7122,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf479a66-03') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.452 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.453 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf479a66-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.454 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.458 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.459 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.459 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c7073440-a0ff-47bc-a92d-538618e4cf91) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.461 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.462 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.465 187216 INFO os_vif [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:4c:c5,bridge_name='br-int',has_traffic_filtering=True,id=bf479a66-0387-4d8d-81f9-8ad5c43f7122,network=Network(d59a13e0-0444-4801-a38d-c9b9b692bc71),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf479a66-03')
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.465 187216 INFO nova.virt.libvirt.driver [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Deleting instance files /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c_del
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.467 187216 INFO nova.virt.libvirt.driver [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Deletion of /var/lib/nova/instances/32736a79-cfc3-48ac-a561-ab16cd576f9c_del complete
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.985 187216 INFO nova.compute.manager [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Took 1.33 seconds to destroy the instance on the hypervisor.
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.986 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.986 187216 DEBUG nova.compute.manager [-] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.986 187216 DEBUG nova.network.neutron [-] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:18:47 compute-0 nova_compute[187212]: 2025-11-25 19:18:47.987 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:48 compute-0 nova_compute[187212]: 2025-11-25 19:18:48.339 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:18:48 compute-0 nova_compute[187212]: 2025-11-25 19:18:48.741 187216 DEBUG nova.compute.manager [req-7c3e8422-3059-4961-9e93-de7f05f333db req-4546513b-8126-4391-a45a-b49ab39ed523 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Received event network-vif-deleted-bf479a66-0387-4d8d-81f9-8ad5c43f7122 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:18:48 compute-0 nova_compute[187212]: 2025-11-25 19:18:48.741 187216 INFO nova.compute.manager [req-7c3e8422-3059-4961-9e93-de7f05f333db req-4546513b-8126-4391-a45a-b49ab39ed523 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Neutron deleted interface bf479a66-0387-4d8d-81f9-8ad5c43f7122; detaching it from the instance and deleting it from the info cache
Nov 25 19:18:48 compute-0 nova_compute[187212]: 2025-11-25 19:18:48.741 187216 DEBUG nova.network.neutron [req-7c3e8422-3059-4961-9e93-de7f05f333db req-4546513b-8126-4391-a45a-b49ab39ed523 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.005 187216 DEBUG nova.compute.manager [req-16cc48d5-5f57-4639-9ee0-bc382cb764ad req-438aa59c-bec0-4eec-974b-f798296c4a04 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Received event network-vif-unplugged-bf479a66-0387-4d8d-81f9-8ad5c43f7122 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.006 187216 DEBUG oslo_concurrency.lockutils [req-16cc48d5-5f57-4639-9ee0-bc382cb764ad req-438aa59c-bec0-4eec-974b-f798296c4a04 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.006 187216 DEBUG oslo_concurrency.lockutils [req-16cc48d5-5f57-4639-9ee0-bc382cb764ad req-438aa59c-bec0-4eec-974b-f798296c4a04 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.006 187216 DEBUG oslo_concurrency.lockutils [req-16cc48d5-5f57-4639-9ee0-bc382cb764ad req-438aa59c-bec0-4eec-974b-f798296c4a04 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.006 187216 DEBUG nova.compute.manager [req-16cc48d5-5f57-4639-9ee0-bc382cb764ad req-438aa59c-bec0-4eec-974b-f798296c4a04 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] No waiting events found dispatching network-vif-unplugged-bf479a66-0387-4d8d-81f9-8ad5c43f7122 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.006 187216 DEBUG nova.compute.manager [req-16cc48d5-5f57-4639-9ee0-bc382cb764ad req-438aa59c-bec0-4eec-974b-f798296c4a04 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Received event network-vif-unplugged-bf479a66-0387-4d8d-81f9-8ad5c43f7122 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.173 187216 DEBUG nova.network.neutron [-] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.248 187216 DEBUG nova.compute.manager [req-7c3e8422-3059-4961-9e93-de7f05f333db req-4546513b-8126-4391-a45a-b49ab39ed523 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Detach interface failed, port_id=bf479a66-0387-4d8d-81f9-8ad5c43f7122, reason: Instance 32736a79-cfc3-48ac-a561-ab16cd576f9c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:18:49 compute-0 nova_compute[187212]: 2025-11-25 19:18:49.681 187216 INFO nova.compute.manager [-] [instance: 32736a79-cfc3-48ac-a561-ab16cd576f9c] Took 1.69 seconds to deallocate network for instance.
Nov 25 19:18:50 compute-0 podman[213633]: 2025-11-25 19:18:50.17549673 +0000 UTC m=+0.093318050 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:18:50 compute-0 nova_compute[187212]: 2025-11-25 19:18:50.203 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:18:50 compute-0 nova_compute[187212]: 2025-11-25 19:18:50.204 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:18:50 compute-0 nova_compute[187212]: 2025-11-25 19:18:50.219 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.015s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:50 compute-0 nova_compute[187212]: 2025-11-25 19:18:50.257 187216 INFO nova.scheduler.client.report [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Deleted allocations for instance 32736a79-cfc3-48ac-a561-ab16cd576f9c
Nov 25 19:18:51 compute-0 nova_compute[187212]: 2025-11-25 19:18:51.291 187216 DEBUG oslo_concurrency.lockutils [None req-c04058c3-11e8-4ecc-9ea6-f90913b28434 81221121fff040f195867ab25b59f26e 492e2572d9d24bf29c79f2b1c9dab462 - - default default] Lock "32736a79-cfc3-48ac-a561-ab16cd576f9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.171s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:18:52 compute-0 nova_compute[187212]: 2025-11-25 19:18:52.458 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:52 compute-0 nova_compute[187212]: 2025-11-25 19:18:52.461 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:57 compute-0 nova_compute[187212]: 2025-11-25 19:18:57.460 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:57 compute-0 nova_compute[187212]: 2025-11-25 19:18:57.463 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:58 compute-0 nova_compute[187212]: 2025-11-25 19:18:58.464 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:18:59 compute-0 podman[197585]: time="2025-11-25T19:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:18:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:18:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2619 "" "Go-http-client/1.1"
Nov 25 19:19:01 compute-0 podman[213654]: 2025-11-25 19:19:01.164588417 +0000 UTC m=+0.081458236 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: ERROR   19:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: ERROR   19:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: ERROR   19:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: ERROR   19:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: ERROR   19:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:19:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:19:02 compute-0 nova_compute[187212]: 2025-11-25 19:19:02.463 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:07 compute-0 nova_compute[187212]: 2025-11-25 19:19:07.465 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:08 compute-0 podman[213679]: 2025-11-25 19:19:08.195759475 +0000 UTC m=+0.120824419 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 19:19:10 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:10.657 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:cb:95 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-11cbd150-3663-4e5b-accf-41b7c6342172', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11cbd150-3663-4e5b-accf-41b7c6342172', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b772bc7ed0495a9623e22117005db4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e4f439c-31aa-44a2-80c1-9a08ca2ad23a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2200f7d8-8f50-478c-bbf9-061a0e683467) old=Port_Binding(mac=['fa:16:3e:68:cb:95'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-11cbd150-3663-4e5b-accf-41b7c6342172', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11cbd150-3663-4e5b-accf-41b7c6342172', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92b772bc7ed0495a9623e22117005db4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:19:10 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:10.658 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2200f7d8-8f50-478c-bbf9-061a0e683467 in datapath 11cbd150-3663-4e5b-accf-41b7c6342172 updated
Nov 25 19:19:10 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:10.660 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11cbd150-3663-4e5b-accf-41b7c6342172, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:19:10 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:10.661 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[51b2a1ec-229c-45d4-9f9d-40def81c89a2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:19:11 compute-0 podman[213705]: 2025-11-25 19:19:11.16014079 +0000 UTC m=+0.078781212 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Nov 25 19:19:12 compute-0 nova_compute[187212]: 2025-11-25 19:19:12.467 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:17 compute-0 podman[213724]: 2025-11-25 19:19:17.17265906 +0000 UTC m=+0.093330330 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41)
Nov 25 19:19:17 compute-0 nova_compute[187212]: 2025-11-25 19:19:17.468 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:18 compute-0 nova_compute[187212]: 2025-11-25 19:19:18.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:18 compute-0 nova_compute[187212]: 2025-11-25 19:19:18.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:19:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:18.670 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:19:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:18.671 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:19:18 compute-0 nova_compute[187212]: 2025-11-25 19:19:18.671 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:20.757 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:02:ea 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0750c3ba-27c8-45b9-9892-bbb639287eb5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0750c3ba-27c8-45b9-9892-bbb639287eb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22227521978446c79fa022517841e5ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27286ac3-c31d-4f06-80e2-b15316739cf2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff320a93-a903-4277-a108-158a588ecb31) old=Port_Binding(mac=['fa:16:3e:5a:02:ea'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0750c3ba-27c8-45b9-9892-bbb639287eb5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0750c3ba-27c8-45b9-9892-bbb639287eb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22227521978446c79fa022517841e5ff', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:19:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:20.759 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff320a93-a903-4277-a108-158a588ecb31 in datapath 0750c3ba-27c8-45b9-9892-bbb639287eb5 updated
Nov 25 19:19:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:20.760 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0750c3ba-27c8-45b9-9892-bbb639287eb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:19:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:20.761 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a18bd4-1eab-421d-aec5-e9c632786986]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:19:21 compute-0 podman[213746]: 2025-11-25 19:19:21.164868835 +0000 UTC m=+0.088187378 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Nov 25 19:19:22 compute-0 nova_compute[187212]: 2025-11-25 19:19:22.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:22 compute-0 nova_compute[187212]: 2025-11-25 19:19:22.470 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.940 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.943 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.976 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.978 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5813MB free_disk=72.99673843383789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.978 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:19:24 compute-0 nova_compute[187212]: 2025-11-25 19:19:24.979 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:19:25 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:25.673 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:19:26 compute-0 nova_compute[187212]: 2025-11-25 19:19:26.084 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:19:26 compute-0 nova_compute[187212]: 2025-11-25 19:19:26.085 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:19:24 up  1:11,  0 user,  load average: 0.22, 0.44, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:19:26 compute-0 nova_compute[187212]: 2025-11-25 19:19:26.108 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:19:26 compute-0 nova_compute[187212]: 2025-11-25 19:19:26.623 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:19:27 compute-0 nova_compute[187212]: 2025-11-25 19:19:27.137 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:19:27 compute-0 nova_compute[187212]: 2025-11-25 19:19:27.138 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:19:27 compute-0 nova_compute[187212]: 2025-11-25 19:19:27.472 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:28 compute-0 nova_compute[187212]: 2025-11-25 19:19:28.139 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:28 compute-0 nova_compute[187212]: 2025-11-25 19:19:28.139 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:28 compute-0 nova_compute[187212]: 2025-11-25 19:19:28.139 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:29 compute-0 podman[197585]: time="2025-11-25T19:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:19:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:19:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2621 "" "Go-http-client/1.1"
Nov 25 19:19:30 compute-0 nova_compute[187212]: 2025-11-25 19:19:30.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:31.092 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:19:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:31.093 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:19:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:31.093 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: ERROR   19:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: ERROR   19:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: ERROR   19:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: ERROR   19:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: ERROR   19:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:19:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:19:31 compute-0 nova_compute[187212]: 2025-11-25 19:19:31.643 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:31 compute-0 nova_compute[187212]: 2025-11-25 19:19:31.644 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:19:32 compute-0 podman[213769]: 2025-11-25 19:19:32.117765429 +0000 UTC m=+0.045932274 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:19:32 compute-0 nova_compute[187212]: 2025-11-25 19:19:32.151 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:19:32 compute-0 nova_compute[187212]: 2025-11-25 19:19:32.475 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:32 compute-0 nova_compute[187212]: 2025-11-25 19:19:32.681 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:32 compute-0 nova_compute[187212]: 2025-11-25 19:19:32.681 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:32 compute-0 ovn_controller[95465]: 2025-11-25T19:19:32Z|00113|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 19:19:34 compute-0 nova_compute[187212]: 2025-11-25 19:19:34.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:37 compute-0 nova_compute[187212]: 2025-11-25 19:19:37.412 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:37 compute-0 nova_compute[187212]: 2025-11-25 19:19:37.478 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:38 compute-0 nova_compute[187212]: 2025-11-25 19:19:38.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:19:38 compute-0 nova_compute[187212]: 2025-11-25 19:19:38.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:19:39 compute-0 podman[213794]: 2025-11-25 19:19:39.223642632 +0000 UTC m=+0.136289559 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:19:42 compute-0 podman[213820]: 2025-11-25 19:19:42.157404893 +0000 UTC m=+0.082067505 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 19:19:42 compute-0 nova_compute[187212]: 2025-11-25 19:19:42.481 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:42 compute-0 nova_compute[187212]: 2025-11-25 19:19:42.482 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:42 compute-0 nova_compute[187212]: 2025-11-25 19:19:42.482 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:19:42 compute-0 nova_compute[187212]: 2025-11-25 19:19:42.482 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:19:42 compute-0 nova_compute[187212]: 2025-11-25 19:19:42.483 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:19:42 compute-0 nova_compute[187212]: 2025-11-25 19:19:42.484 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:47 compute-0 nova_compute[187212]: 2025-11-25 19:19:47.484 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:48 compute-0 podman[213840]: 2025-11-25 19:19:48.158457617 +0000 UTC m=+0.077459592 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350)
Nov 25 19:19:52 compute-0 podman[213862]: 2025-11-25 19:19:52.171318491 +0000 UTC m=+0.087575852 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:19:52 compute-0 nova_compute[187212]: 2025-11-25 19:19:52.485 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:55.989 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:23:99 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cff475c9ba4a43f881c2be6a94ae0ff9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9941ceeb-16f5-4a0e-8227-c1de720c5499) old=Port_Binding(mac=['fa:16:3e:98:23:99'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cff475c9ba4a43f881c2be6a94ae0ff9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:19:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:55.991 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9941ceeb-16f5-4a0e-8227-c1de720c5499 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 updated
Nov 25 19:19:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:55.992 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c041141-ab86-4697-993b-67edbc4f2488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:19:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:19:55.993 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[588caf09-59de-4d93-bacd-0cdee40faff2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:19:57 compute-0 nova_compute[187212]: 2025-11-25 19:19:57.488 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:57 compute-0 nova_compute[187212]: 2025-11-25 19:19:57.490 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:19:57 compute-0 nova_compute[187212]: 2025-11-25 19:19:57.490 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:19:57 compute-0 nova_compute[187212]: 2025-11-25 19:19:57.490 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:19:57 compute-0 nova_compute[187212]: 2025-11-25 19:19:57.514 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:19:57 compute-0 nova_compute[187212]: 2025-11-25 19:19:57.515 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:19:59 compute-0 podman[197585]: time="2025-11-25T19:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:19:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:19:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2621 "" "Go-http-client/1.1"
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: ERROR   19:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: ERROR   19:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: ERROR   19:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: ERROR   19:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: ERROR   19:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:20:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:20:02 compute-0 nova_compute[187212]: 2025-11-25 19:20:02.516 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:02 compute-0 nova_compute[187212]: 2025-11-25 19:20:02.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:02 compute-0 nova_compute[187212]: 2025-11-25 19:20:02.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:02 compute-0 nova_compute[187212]: 2025-11-25 19:20:02.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:02 compute-0 nova_compute[187212]: 2025-11-25 19:20:02.546 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:02 compute-0 nova_compute[187212]: 2025-11-25 19:20:02.547 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:03 compute-0 podman[213880]: 2025-11-25 19:20:03.152894673 +0000 UTC m=+0.077042609 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:20:07 compute-0 nova_compute[187212]: 2025-11-25 19:20:07.548 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:10 compute-0 podman[213906]: 2025-11-25 19:20:10.22129473 +0000 UTC m=+0.133303447 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 25 19:20:11 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:11.483 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:cf:c0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e497ac72-cf0f-4254-835f-6148d6829c01', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e497ac72-cf0f-4254-835f-6148d6829c01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14749d3c-e194-432b-80e0-9f3c7123947c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a74073d8-5653-4a04-bceb-7582651fd41c) old=Port_Binding(mac=['fa:16:3e:97:cf:c0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e497ac72-cf0f-4254-835f-6148d6829c01', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e497ac72-cf0f-4254-835f-6148d6829c01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:20:11 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:11.485 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a74073d8-5653-4a04-bceb-7582651fd41c in datapath e497ac72-cf0f-4254-835f-6148d6829c01 updated
Nov 25 19:20:11 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:11.486 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e497ac72-cf0f-4254-835f-6148d6829c01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:20:11 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:11.487 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[56c93607-1f5e-4ec0-bfcd-bef427e1f61f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:20:12 compute-0 nova_compute[187212]: 2025-11-25 19:20:12.549 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:12 compute-0 nova_compute[187212]: 2025-11-25 19:20:12.550 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:12 compute-0 nova_compute[187212]: 2025-11-25 19:20:12.551 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:12 compute-0 nova_compute[187212]: 2025-11-25 19:20:12.551 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:12 compute-0 nova_compute[187212]: 2025-11-25 19:20:12.551 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:12 compute-0 nova_compute[187212]: 2025-11-25 19:20:12.552 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:13 compute-0 podman[213933]: 2025-11-25 19:20:13.161848585 +0000 UTC m=+0.075358305 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:20:17 compute-0 nova_compute[187212]: 2025-11-25 19:20:17.554 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:17 compute-0 nova_compute[187212]: 2025-11-25 19:20:17.556 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:17 compute-0 nova_compute[187212]: 2025-11-25 19:20:17.556 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:17 compute-0 nova_compute[187212]: 2025-11-25 19:20:17.556 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:17 compute-0 nova_compute[187212]: 2025-11-25 19:20:17.572 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:17 compute-0 nova_compute[187212]: 2025-11-25 19:20:17.572 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:18 compute-0 nova_compute[187212]: 2025-11-25 19:20:18.681 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:18 compute-0 nova_compute[187212]: 2025-11-25 19:20:18.681 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:20:19 compute-0 podman[213953]: 2025-11-25 19:20:19.204353953 +0000 UTC m=+0.124402192 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Nov 25 19:20:21 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.573 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.575 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.575 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.576 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.618 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:22 compute-0 nova_compute[187212]: 2025-11-25 19:20:22.619 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:23 compute-0 podman[213976]: 2025-11-25 19:20:23.194194764 +0000 UTC m=+0.113953575 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.928 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.930 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.962 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.963 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5829MB free_disk=72.99286270141602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.964 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:24 compute-0 nova_compute[187212]: 2025-11-25 19:20:24.964 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.120 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.120 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:20:24 up  1:12,  0 user,  load average: 0.41, 0.45, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.190 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.258 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.258 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.282 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.299 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.324 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:20:26 compute-0 nova_compute[187212]: 2025-11-25 19:20:26.831 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:20:27 compute-0 nova_compute[187212]: 2025-11-25 19:20:27.345 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:20:27 compute-0 nova_compute[187212]: 2025-11-25 19:20:27.346 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.381s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:27 compute-0 nova_compute[187212]: 2025-11-25 19:20:27.620 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:28.466 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:20:28 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:28.466 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:20:28 compute-0 nova_compute[187212]: 2025-11-25 19:20:28.467 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:29 compute-0 nova_compute[187212]: 2025-11-25 19:20:29.346 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:29 compute-0 nova_compute[187212]: 2025-11-25 19:20:29.346 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:29 compute-0 nova_compute[187212]: 2025-11-25 19:20:29.347 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:29 compute-0 podman[197585]: time="2025-11-25T19:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:20:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:20:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2616 "" "Go-http-client/1.1"
Nov 25 19:20:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:31.094 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:31.094 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:31.095 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: ERROR   19:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: ERROR   19:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: ERROR   19:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: ERROR   19:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: ERROR   19:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:20:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:20:32 compute-0 nova_compute[187212]: 2025-11-25 19:20:32.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:32 compute-0 nova_compute[187212]: 2025-11-25 19:20:32.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:20:32 compute-0 nova_compute[187212]: 2025-11-25 19:20:32.620 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:32 compute-0 nova_compute[187212]: 2025-11-25 19:20:32.622 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:34 compute-0 podman[214000]: 2025-11-25 19:20:34.159886663 +0000 UTC m=+0.072896179 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:20:37 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:20:37.468 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:20:37 compute-0 nova_compute[187212]: 2025-11-25 19:20:37.623 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:41 compute-0 podman[214024]: 2025-11-25 19:20:41.260243275 +0000 UTC m=+0.181606335 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:20:42 compute-0 nova_compute[187212]: 2025-11-25 19:20:42.624 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:42 compute-0 nova_compute[187212]: 2025-11-25 19:20:42.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:42 compute-0 nova_compute[187212]: 2025-11-25 19:20:42.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:42 compute-0 nova_compute[187212]: 2025-11-25 19:20:42.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:42 compute-0 nova_compute[187212]: 2025-11-25 19:20:42.628 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:42 compute-0 nova_compute[187212]: 2025-11-25 19:20:42.630 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:44 compute-0 podman[214051]: 2025-11-25 19:20:44.158705776 +0000 UTC m=+0.075411967 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:20:46 compute-0 nova_compute[187212]: 2025-11-25 19:20:46.503 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:46 compute-0 nova_compute[187212]: 2025-11-25 19:20:46.503 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:47 compute-0 nova_compute[187212]: 2025-11-25 19:20:47.009 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:20:47 compute-0 nova_compute[187212]: 2025-11-25 19:20:47.576 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:47 compute-0 nova_compute[187212]: 2025-11-25 19:20:47.577 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:47 compute-0 nova_compute[187212]: 2025-11-25 19:20:47.588 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:20:47 compute-0 nova_compute[187212]: 2025-11-25 19:20:47.589 187216 INFO nova.compute.claims [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:20:47 compute-0 nova_compute[187212]: 2025-11-25 19:20:47.629 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:48 compute-0 nova_compute[187212]: 2025-11-25 19:20:48.651 187216 DEBUG nova.compute.provider_tree [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:20:49 compute-0 nova_compute[187212]: 2025-11-25 19:20:49.158 187216 DEBUG nova.scheduler.client.report [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:20:49 compute-0 nova_compute[187212]: 2025-11-25 19:20:49.679 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:49 compute-0 nova_compute[187212]: 2025-11-25 19:20:49.680 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:20:50 compute-0 podman[214070]: 2025-11-25 19:20:50.165788629 +0000 UTC m=+0.086214271 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal)
Nov 25 19:20:50 compute-0 nova_compute[187212]: 2025-11-25 19:20:50.193 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:20:50 compute-0 nova_compute[187212]: 2025-11-25 19:20:50.194 187216 DEBUG nova.network.neutron [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:20:50 compute-0 nova_compute[187212]: 2025-11-25 19:20:50.194 187216 WARNING neutronclient.v2_0.client [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:20:50 compute-0 nova_compute[187212]: 2025-11-25 19:20:50.195 187216 WARNING neutronclient.v2_0.client [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:20:50 compute-0 nova_compute[187212]: 2025-11-25 19:20:50.703 187216 INFO nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:20:51 compute-0 nova_compute[187212]: 2025-11-25 19:20:51.110 187216 DEBUG nova.network.neutron [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Successfully created port: 059368de-9300-4ad9-a662-f1bd38b00db6 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:20:51 compute-0 nova_compute[187212]: 2025-11-25 19:20:51.213 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.232 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.234 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.235 187216 INFO nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Creating image(s)
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.236 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "/var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.236 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "/var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.238 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "/var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.239 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.246 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.248 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.331 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.333 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.335 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.336 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.342 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.343 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.411 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.413 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.471 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.472 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.473 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.558 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.560 187216 DEBUG nova.virt.disk.api [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Checking if we can resize image /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.560 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.632 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.634 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.635 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.635 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.643 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.645 187216 DEBUG nova.virt.disk.api [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Cannot resize image /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.646 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.646 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Ensure instance console log exists: /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.647 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.648 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.648 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.665 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:52 compute-0 nova_compute[187212]: 2025-11-25 19:20:52.666 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:54 compute-0 podman[214106]: 2025-11-25 19:20:54.167764002 +0000 UTC m=+0.087388093 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:20:54 compute-0 nova_compute[187212]: 2025-11-25 19:20:54.812 187216 DEBUG nova.network.neutron [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Successfully updated port: 059368de-9300-4ad9-a662-f1bd38b00db6 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:20:54 compute-0 nova_compute[187212]: 2025-11-25 19:20:54.887 187216 DEBUG nova.compute.manager [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-changed-059368de-9300-4ad9-a662-f1bd38b00db6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:20:54 compute-0 nova_compute[187212]: 2025-11-25 19:20:54.887 187216 DEBUG nova.compute.manager [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Refreshing instance network info cache due to event network-changed-059368de-9300-4ad9-a662-f1bd38b00db6. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:20:54 compute-0 nova_compute[187212]: 2025-11-25 19:20:54.888 187216 DEBUG oslo_concurrency.lockutils [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:20:54 compute-0 nova_compute[187212]: 2025-11-25 19:20:54.888 187216 DEBUG oslo_concurrency.lockutils [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:20:54 compute-0 nova_compute[187212]: 2025-11-25 19:20:54.888 187216 DEBUG nova.network.neutron [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Refreshing network info cache for port 059368de-9300-4ad9-a662-f1bd38b00db6 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:20:55 compute-0 nova_compute[187212]: 2025-11-25 19:20:55.320 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "refresh_cache-4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:20:55 compute-0 nova_compute[187212]: 2025-11-25 19:20:55.396 187216 WARNING neutronclient.v2_0.client [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:20:55 compute-0 nova_compute[187212]: 2025-11-25 19:20:55.489 187216 DEBUG nova.network.neutron [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:20:55 compute-0 nova_compute[187212]: 2025-11-25 19:20:55.979 187216 DEBUG nova.network.neutron [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:20:56 compute-0 nova_compute[187212]: 2025-11-25 19:20:56.486 187216 DEBUG oslo_concurrency.lockutils [req-c0ee234b-0b8f-46ae-aa0b-62c06de86652 req-4df0cd32-0fda-4476-bf48-c3b7a0650ee8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:20:56 compute-0 nova_compute[187212]: 2025-11-25 19:20:56.487 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquired lock "refresh_cache-4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:20:56 compute-0 nova_compute[187212]: 2025-11-25 19:20:56.488 187216 DEBUG nova.network.neutron [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.444 187216 DEBUG nova.network.neutron [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.666 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.669 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.669 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.670 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.671 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:20:57 compute-0 nova_compute[187212]: 2025-11-25 19:20:57.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:58 compute-0 nova_compute[187212]: 2025-11-25 19:20:58.429 187216 WARNING neutronclient.v2_0.client [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:20:58 compute-0 nova_compute[187212]: 2025-11-25 19:20:58.715 187216 DEBUG nova.network.neutron [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Updating instance_info_cache with network_info: [{"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.224 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Releasing lock "refresh_cache-4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.225 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Instance network_info: |[{"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.230 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Start _get_guest_xml network_info=[{"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.235 187216 WARNING nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.237 187216 DEBUG nova.virt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-426547155', uuid='4ab49376-6ee4-40fa-b6ed-b289cfc5c43c'), owner=OwnerMeta(userid='b86907256ac0401183dd8a2c5394fe31', username='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin', projectid='01a0280ccebb48a888956426fb3d2015', projectname='tempest-TestExecuteHostMaintenanceStrategy-1349736763'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": 
"059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098459.237748) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.246 187216 DEBUG nova.virt.libvirt.host [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.247 187216 DEBUG nova.virt.libvirt.host [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.253 187216 DEBUG nova.virt.libvirt.host [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.254 187216 DEBUG nova.virt.libvirt.host [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.256 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.256 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.257 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.257 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.258 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.258 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.258 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.258 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.259 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.259 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.259 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.260 187216 DEBUG nova.virt.hardware [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.266 187216 DEBUG nova.virt.libvirt.vif [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:20:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-426547155',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-426547155',id=13,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-mfla5az4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-Tes
tExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:20:51Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=4ab49376-6ee4-40fa-b6ed-b289cfc5c43c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.266 187216 DEBUG nova.network.os_vif_util [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.267 187216 DEBUG nova.network.os_vif_util [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.268 187216 DEBUG nova.objects.instance [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:20:59 compute-0 podman[197585]: time="2025-11-25T19:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:20:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:20:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2621 "" "Go-http-client/1.1"
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.781 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <uuid>4ab49376-6ee4-40fa-b6ed-b289cfc5c43c</uuid>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <name>instance-0000000d</name>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-426547155</nova:name>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:20:59</nova:creationTime>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:20:59 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:20:59 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:user uuid="b86907256ac0401183dd8a2c5394fe31">tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin</nova:user>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:project uuid="01a0280ccebb48a888956426fb3d2015">tempest-TestExecuteHostMaintenanceStrategy-1349736763</nova:project>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         <nova:port uuid="059368de-9300-4ad9-a662-f1bd38b00db6">
Nov 25 19:20:59 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <system>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <entry name="serial">4ab49376-6ee4-40fa-b6ed-b289cfc5c43c</entry>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <entry name="uuid">4ab49376-6ee4-40fa-b6ed-b289cfc5c43c</entry>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </system>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <os>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </os>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <features>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </features>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.config"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:9f:b5:dd"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <target dev="tap059368de-93"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/console.log" append="off"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <video>
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </video>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:20:59 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:20:59 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:20:59 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:20:59 compute-0 nova_compute[187212]: </domain>
Nov 25 19:20:59 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.781 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Preparing to wait for external event network-vif-plugged-059368de-9300-4ad9-a662-f1bd38b00db6 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.782 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.782 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.782 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.783 187216 DEBUG nova.virt.libvirt.vif [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:20:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-426547155',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-426547155',id=13,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-mfla5az4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:20:51Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=4ab49376-6ee4-40fa-b6ed-b289cfc5c43c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.784 187216 DEBUG nova.network.os_vif_util [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.784 187216 DEBUG nova.network.os_vif_util [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.785 187216 DEBUG os_vif [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.786 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.786 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.787 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.788 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.788 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '92fc7535-a1ef-5941-ac6d-1580f585433d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.827 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.830 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.833 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.833 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap059368de-93, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.834 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap059368de-93, col_values=(('qos', UUID('da53c874-8c71-41f3-a173-6bf3bde3a680')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.834 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap059368de-93, col_values=(('external_ids', {'iface-id': '059368de-9300-4ad9-a662-f1bd38b00db6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:b5:dd', 'vm-uuid': '4ab49376-6ee4-40fa-b6ed-b289cfc5c43c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.836 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 NetworkManager[55552]: <info>  [1764098459.8380] manager: (tap059368de-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.839 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.844 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:20:59 compute-0 nova_compute[187212]: 2025-11-25 19:20:59.845 187216 INFO os_vif [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93')
Nov 25 19:21:01 compute-0 nova_compute[187212]: 2025-11-25 19:21:01.401 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:21:01 compute-0 nova_compute[187212]: 2025-11-25 19:21:01.401 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:21:01 compute-0 nova_compute[187212]: 2025-11-25 19:21:01.402 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] No VIF found with MAC fa:16:3e:9f:b5:dd, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:21:01 compute-0 nova_compute[187212]: 2025-11-25 19:21:01.403 187216 INFO nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Using config drive
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: ERROR   19:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: ERROR   19:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: ERROR   19:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: ERROR   19:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: ERROR   19:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:21:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:21:01 compute-0 nova_compute[187212]: 2025-11-25 19:21:01.918 187216 WARNING neutronclient.v2_0.client [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.589 187216 INFO nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Creating config drive at /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.config
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.599 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpvrhbvn61 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.670 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.743 187216 DEBUG oslo_concurrency.processutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpvrhbvn61" returned: 0 in 0.144s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:02 compute-0 kernel: tap059368de-93: entered promiscuous mode
Nov 25 19:21:02 compute-0 NetworkManager[55552]: <info>  [1764098462.8378] manager: (tap059368de-93): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Nov 25 19:21:02 compute-0 ovn_controller[95465]: 2025-11-25T19:21:02Z|00114|binding|INFO|Claiming lport 059368de-9300-4ad9-a662-f1bd38b00db6 for this chassis.
Nov 25 19:21:02 compute-0 ovn_controller[95465]: 2025-11-25T19:21:02Z|00115|binding|INFO|059368de-9300-4ad9-a662-f1bd38b00db6: Claiming fa:16:3e:9f:b5:dd 10.100.0.10
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.843 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.848 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.853 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.864 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:b5:dd 10.100.0.10'], port_security=['fa:16:3e:9f:b5:dd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4ab49376-6ee4-40fa-b6ed-b289cfc5c43c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=059368de-9300-4ad9-a662-f1bd38b00db6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.865 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 059368de-9300-4ad9-a662-f1bd38b00db6 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 bound to our chassis
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.866 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:21:02 compute-0 systemd-udevd[214148]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.884 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[26823cd1-b8fa-495a-b536-52f20ce5eccf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.885 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4c041141-a1 in ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.887 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4c041141-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.887 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[72b1f542-48b7-4d18-8b93-a7936d6480ec]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.888 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a7368d-bd87-475f-8238-fefc7b24cfda]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 systemd-machined[153494]: New machine qemu-10-instance-0000000d.
Nov 25 19:21:02 compute-0 NetworkManager[55552]: <info>  [1764098462.9029] device (tap059368de-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:21:02 compute-0 NetworkManager[55552]: <info>  [1764098462.9042] device (tap059368de-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.906 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[a6654044-2e42-4486-b64e-feee78a4eba2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.928 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[fa04919c-50ea-484d-b162-25c14c542f25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.940 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:02 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Nov 25 19:21:02 compute-0 ovn_controller[95465]: 2025-11-25T19:21:02Z|00116|binding|INFO|Setting lport 059368de-9300-4ad9-a662-f1bd38b00db6 ovn-installed in OVS
Nov 25 19:21:02 compute-0 ovn_controller[95465]: 2025-11-25T19:21:02Z|00117|binding|INFO|Setting lport 059368de-9300-4ad9-a662-f1bd38b00db6 up in Southbound
Nov 25 19:21:02 compute-0 nova_compute[187212]: 2025-11-25 19:21:02.945 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.963 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[4428c1a5-620d-4793-956d-a25ae372e127]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:02.968 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[e714a05c-9791-4736-a2be-2b251de0c8c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:02 compute-0 NetworkManager[55552]: <info>  [1764098462.9703] manager: (tap4c041141-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.007 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[f445b84e-24ac-4953-b552-7f0ea6145b16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.011 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[07c14c78-333e-4b80-ac61-fa33d045aab1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 NetworkManager[55552]: <info>  [1764098463.0422] device (tap4c041141-a0): carrier: link connected
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.051 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[8a69e26d-7c35-4be5-ba10-b9da4f21db6d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.075 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b06ee07c-0355-4288-9ed4-81eba48577a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441071, 'reachable_time': 20794, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214181, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.095 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[26c6d926-c7b4-4f33-a32f-7b3dc125f3eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe98:2399'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441071, 'tstamp': 441071}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214182, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.119 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[47ba82b3-92b5-4022-9fbf-86713da15ac6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441071, 'reachable_time': 20794, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214183, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.157 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f5acf3ca-168c-4fd7-a5ae-d55dbcfce617]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.240 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac86b2a-4d6d-4d6f-bccd-112a0013096e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.241 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.241 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.242 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c041141-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:03 compute-0 kernel: tap4c041141-a0: entered promiscuous mode
Nov 25 19:21:03 compute-0 NetworkManager[55552]: <info>  [1764098463.2458] manager: (tap4c041141-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.244 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.254 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c041141-a0, col_values=(('external_ids', {'iface-id': '9941ceeb-16f5-4a0e-8227-c1de720c5499'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.255 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:03 compute-0 ovn_controller[95465]: 2025-11-25T19:21:03Z|00118|binding|INFO|Releasing lport 9941ceeb-16f5-4a0e-8227-c1de720c5499 from this chassis (sb_readonly=0)
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.258 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[503001ed-6d1e-4081-8c11-84ec27ef224b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.260 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.260 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.260 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 4c041141-ab86-4697-993b-67edbc4f2488 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.260 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.261 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[af6a2ec7-551b-49cc-aac3-0ec9fad00115]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.262 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.262 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a8ad60-37ce-4649-a49c-7d19290cb45e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.263 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:21:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:03.264 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'env', 'PROCESS_TAG=haproxy-4c041141-ab86-4697-993b-67edbc4f2488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4c041141-ab86-4697-993b-67edbc4f2488.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.268 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.566 187216 DEBUG nova.compute.manager [req-b299ab94-30fc-41ad-bc0b-691643cc9963 req-0f473aff-236c-42a5-a880-810a6f572ec8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-plugged-059368de-9300-4ad9-a662-f1bd38b00db6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.567 187216 DEBUG oslo_concurrency.lockutils [req-b299ab94-30fc-41ad-bc0b-691643cc9963 req-0f473aff-236c-42a5-a880-810a6f572ec8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.567 187216 DEBUG oslo_concurrency.lockutils [req-b299ab94-30fc-41ad-bc0b-691643cc9963 req-0f473aff-236c-42a5-a880-810a6f572ec8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.568 187216 DEBUG oslo_concurrency.lockutils [req-b299ab94-30fc-41ad-bc0b-691643cc9963 req-0f473aff-236c-42a5-a880-810a6f572ec8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.568 187216 DEBUG nova.compute.manager [req-b299ab94-30fc-41ad-bc0b-691643cc9963 req-0f473aff-236c-42a5-a880-810a6f572ec8 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Processing event network-vif-plugged-059368de-9300-4ad9-a662-f1bd38b00db6 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.569 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.575 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.581 187216 INFO nova.virt.libvirt.driver [-] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Instance spawned successfully.
Nov 25 19:21:03 compute-0 nova_compute[187212]: 2025-11-25 19:21:03.582 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:21:03 compute-0 podman[214222]: 2025-11-25 19:21:03.751612017 +0000 UTC m=+0.077880781 container create b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Nov 25 19:21:03 compute-0 systemd[1]: Started libpod-conmon-b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d.scope.
Nov 25 19:21:03 compute-0 podman[214222]: 2025-11-25 19:21:03.712087271 +0000 UTC m=+0.038356095 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:21:03 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:21:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8019e5a190d873e9c0aae319220dde3f36d7c09476638025b4ed75b571168f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:21:03 compute-0 podman[214222]: 2025-11-25 19:21:03.866908297 +0000 UTC m=+0.193177101 container init b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:21:03 compute-0 podman[214222]: 2025-11-25 19:21:03.87422077 +0000 UTC m=+0.200489544 container start b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest)
Nov 25 19:21:03 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [NOTICE]   (214240) : New worker (214242) forked
Nov 25 19:21:03 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [NOTICE]   (214240) : Loading success.
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.101 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.102 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.103 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.104 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.104 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.105 187216 DEBUG nova.virt.libvirt.driver [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.617 187216 INFO nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Took 12.38 seconds to spawn the instance on the hypervisor.
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.618 187216 DEBUG nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:21:04 compute-0 nova_compute[187212]: 2025-11-25 19:21:04.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.168 187216 INFO nova.compute.manager [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Took 17.65 seconds to build instance.
Nov 25 19:21:05 compute-0 podman[214251]: 2025-11-25 19:21:05.2075744 +0000 UTC m=+0.126119867 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.659 187216 DEBUG nova.compute.manager [req-a813b2e0-bb8d-4d5c-a99e-5b5624632f9f req-84fa5700-f90e-4a99-a683-086605f20b94 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-plugged-059368de-9300-4ad9-a662-f1bd38b00db6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.660 187216 DEBUG oslo_concurrency.lockutils [req-a813b2e0-bb8d-4d5c-a99e-5b5624632f9f req-84fa5700-f90e-4a99-a683-086605f20b94 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.661 187216 DEBUG oslo_concurrency.lockutils [req-a813b2e0-bb8d-4d5c-a99e-5b5624632f9f req-84fa5700-f90e-4a99-a683-086605f20b94 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.662 187216 DEBUG oslo_concurrency.lockutils [req-a813b2e0-bb8d-4d5c-a99e-5b5624632f9f req-84fa5700-f90e-4a99-a683-086605f20b94 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.663 187216 DEBUG nova.compute.manager [req-a813b2e0-bb8d-4d5c-a99e-5b5624632f9f req-84fa5700-f90e-4a99-a683-086605f20b94 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] No waiting events found dispatching network-vif-plugged-059368de-9300-4ad9-a662-f1bd38b00db6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.663 187216 WARNING nova.compute.manager [req-a813b2e0-bb8d-4d5c-a99e-5b5624632f9f req-84fa5700-f90e-4a99-a683-086605f20b94 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received unexpected event network-vif-plugged-059368de-9300-4ad9-a662-f1bd38b00db6 for instance with vm_state active and task_state None.
Nov 25 19:21:05 compute-0 nova_compute[187212]: 2025-11-25 19:21:05.675 187216 DEBUG oslo_concurrency.lockutils [None req-45791cb8-4620-46a9-b68f-768765b9a7b9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.171s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:07 compute-0 nova_compute[187212]: 2025-11-25 19:21:07.674 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:09 compute-0 nova_compute[187212]: 2025-11-25 19:21:09.840 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:12 compute-0 podman[214275]: 2025-11-25 19:21:12.231170231 +0000 UTC m=+0.150045559 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:21:12 compute-0 nova_compute[187212]: 2025-11-25 19:21:12.676 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:14 compute-0 nova_compute[187212]: 2025-11-25 19:21:14.844 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:15 compute-0 podman[214308]: 2025-11-25 19:21:15.155709873 +0000 UTC m=+0.074406969 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 25 19:21:15 compute-0 ovn_controller[95465]: 2025-11-25T19:21:15Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:b5:dd 10.100.0.10
Nov 25 19:21:15 compute-0 ovn_controller[95465]: 2025-11-25T19:21:15Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:b5:dd 10.100.0.10
Nov 25 19:21:17 compute-0 nova_compute[187212]: 2025-11-25 19:21:17.679 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:19 compute-0 nova_compute[187212]: 2025-11-25 19:21:19.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:19 compute-0 nova_compute[187212]: 2025-11-25 19:21:19.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:21:19 compute-0 nova_compute[187212]: 2025-11-25 19:21:19.847 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:21 compute-0 podman[214342]: 2025-11-25 19:21:21.182355101 +0000 UTC m=+0.098575758 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, config_id=edpm)
Nov 25 19:21:21 compute-0 nova_compute[187212]: 2025-11-25 19:21:21.621 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Creating tmpfile /var/lib/nova/instances/tmpsksj4kj6 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Nov 25 19:21:21 compute-0 nova_compute[187212]: 2025-11-25 19:21:21.622 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:21 compute-0 nova_compute[187212]: 2025-11-25 19:21:21.639 187216 DEBUG nova.compute.manager [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsksj4kj6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Nov 25 19:21:22 compute-0 nova_compute[187212]: 2025-11-25 19:21:22.682 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:23 compute-0 nova_compute[187212]: 2025-11-25 19:21:23.695 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:24 compute-0 nova_compute[187212]: 2025-11-25 19:21:24.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:24 compute-0 nova_compute[187212]: 2025-11-25 19:21:24.850 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:25 compute-0 podman[214365]: 2025-11-25 19:21:25.164618311 +0000 UTC m=+0.087279810 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 25 19:21:25 compute-0 nova_compute[187212]: 2025-11-25 19:21:25.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:25 compute-0 nova_compute[187212]: 2025-11-25 19:21:25.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:25 compute-0 nova_compute[187212]: 2025-11-25 19:21:25.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:25 compute-0 nova_compute[187212]: 2025-11-25 19:21:25.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:25 compute-0 nova_compute[187212]: 2025-11-25 19:21:25.694 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:21:26 compute-0 nova_compute[187212]: 2025-11-25 19:21:26.741 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:26 compute-0 nova_compute[187212]: 2025-11-25 19:21:26.831 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:26 compute-0 nova_compute[187212]: 2025-11-25 19:21:26.833 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:26 compute-0 nova_compute[187212]: 2025-11-25 19:21:26.923 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.174 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.176 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.200 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.201 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5616MB free_disk=72.96366500854492GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.201 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.202 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:27 compute-0 nova_compute[187212]: 2025-11-25 19:21:27.685 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:28 compute-0 nova_compute[187212]: 2025-11-25 19:21:28.223 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Migration for instance f6098ca7-42a8-4720-be83-d8dded5070c2 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Nov 25 19:21:28 compute-0 nova_compute[187212]: 2025-11-25 19:21:28.262 187216 DEBUG nova.compute.manager [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsksj4kj6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6098ca7-42a8-4720-be83-d8dded5070c2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Nov 25 19:21:28 compute-0 nova_compute[187212]: 2025-11-25 19:21:28.731 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Updating resource usage from migration 18f8aba0-99f4-4c55-833c-4944c797b212
Nov 25 19:21:28 compute-0 nova_compute[187212]: 2025-11-25 19:21:28.732 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Starting to track incoming migration 18f8aba0-99f4-4c55-833c-4944c797b212 with flavor d7d5bae9-10ca-4750-9d69-ce73a869da56 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.277 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-f6098ca7-42a8-4720-be83-d8dded5070c2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.278 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-f6098ca7-42a8-4720-be83-d8dded5070c2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.278 187216 DEBUG nova.network.neutron [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.283 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:21:29 compute-0 podman[197585]: time="2025-11-25T19:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:21:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:21:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3085 "" "Go-http-client/1.1"
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.787 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.796 187216 WARNING nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance f6098ca7-42a8-4720-be83-d8dded5070c2 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.796 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.797 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:21:27 up  1:13,  0 user,  load average: 0.39, 0.43, 0.47\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_01a0280ccebb48a888956426fb3d2015': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.846 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:21:29 compute-0 nova_compute[187212]: 2025-11-25 19:21:29.853 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:31.096 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:31.097 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:31.098 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:31 compute-0 nova_compute[187212]: 2025-11-25 19:21:31.164 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:21:31 compute-0 openstack_network_exporter[199731]: ERROR   19:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:21:31 compute-0 openstack_network_exporter[199731]: ERROR   19:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:21:31 compute-0 openstack_network_exporter[199731]: ERROR   19:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:21:31 compute-0 openstack_network_exporter[199731]: ERROR   19:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:21:31 compute-0 openstack_network_exporter[199731]: ERROR   19:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:21:31 compute-0 nova_compute[187212]: 2025-11-25 19:21:31.677 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:21:31 compute-0 nova_compute[187212]: 2025-11-25 19:21:31.678 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.476s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:32 compute-0 nova_compute[187212]: 2025-11-25 19:21:32.437 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:32 compute-0 nova_compute[187212]: 2025-11-25 19:21:32.621 187216 DEBUG nova.network.neutron [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Updating instance_info_cache with network_info: [{"id": "38222a41-1966-455e-afa1-2bb851b0c328", "address": "fa:16:3e:6a:47:e2", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38222a41-19", "ovs_interfaceid": "38222a41-1966-455e-afa1-2bb851b0c328", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:21:32 compute-0 nova_compute[187212]: 2025-11-25 19:21:32.677 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:32 compute-0 nova_compute[187212]: 2025-11-25 19:21:32.678 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:32 compute-0 nova_compute[187212]: 2025-11-25 19:21:32.729 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.128 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-f6098ca7-42a8-4720-be83-d8dded5070c2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.144 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsksj4kj6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6098ca7-42a8-4720-be83-d8dded5070c2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.145 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Creating instance directory: /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.146 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Creating disk.info with the contents: {'/var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk': 'qcow2', '/var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.146 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.147 187216 DEBUG nova.objects.instance [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f6098ca7-42a8-4720-be83-d8dded5070c2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.187 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.188 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.188 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.189 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:21:33 compute-0 ovn_controller[95465]: 2025-11-25T19:21:33Z|00119|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.655 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.661 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.663 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.759 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.765 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.766 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.767 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.774 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.775 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.862 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.864 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.923 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.925 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:33 compute-0 nova_compute[187212]: 2025-11-25 19:21:33.926 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.012 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.013 187216 DEBUG nova.virt.disk.api [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Checking if we can resize image /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.014 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.101 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.103 187216 DEBUG nova.virt.disk.api [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Cannot resize image /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.104 187216 DEBUG nova.objects.instance [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'migration_context' on Instance uuid f6098ca7-42a8-4720-be83-d8dded5070c2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.616 187216 DEBUG nova.objects.base [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<f6098ca7-42a8-4720-be83-d8dded5070c2> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.617 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.654 187216 DEBUG oslo_concurrency.processutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2/disk.config 497664" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.655 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.656 187216 DEBUG nova.virt.libvirt.vif [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-11-25T19:20:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1516267269',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1516267269',id=12,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:20:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-0s0cry0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:20:39Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=f6098ca7-42a8-4720-be83-d8dded5070c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38222a41-1966-455e-afa1-2bb851b0c328", "address": "fa:16:3e:6a:47:e2", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap38222a41-19", "ovs_interfaceid": "38222a41-1966-455e-afa1-2bb851b0c328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.656 187216 DEBUG nova.network.os_vif_util [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "38222a41-1966-455e-afa1-2bb851b0c328", "address": "fa:16:3e:6a:47:e2", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap38222a41-19", "ovs_interfaceid": "38222a41-1966-455e-afa1-2bb851b0c328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.657 187216 DEBUG nova.network.os_vif_util [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:47:e2,bridge_name='br-int',has_traffic_filtering=True,id=38222a41-1966-455e-afa1-2bb851b0c328,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38222a41-19') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.657 187216 DEBUG os_vif [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:47:e2,bridge_name='br-int',has_traffic_filtering=True,id=38222a41-1966-455e-afa1-2bb851b0c328,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38222a41-19') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.658 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.658 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.659 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.659 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.660 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0f33c8d3-dee0-59dd-b4d5-89e0e74bc6fc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.661 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.662 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.665 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.666 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38222a41-19, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.666 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap38222a41-19, col_values=(('qos', UUID('076da707-2634-4a59-b129-e316e1921a47')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.666 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap38222a41-19, col_values=(('external_ids', {'iface-id': '38222a41-1966-455e-afa1-2bb851b0c328', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:47:e2', 'vm-uuid': 'f6098ca7-42a8-4720-be83-d8dded5070c2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.667 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 NetworkManager[55552]: <info>  [1764098494.6687] manager: (tap38222a41-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.669 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.674 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.675 187216 INFO os_vif [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:47:e2,bridge_name='br-int',has_traffic_filtering=True,id=38222a41-1966-455e-afa1-2bb851b0c328,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38222a41-19')
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.676 187216 DEBUG nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.676 187216 DEBUG nova.compute.manager [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsksj4kj6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6098ca7-42a8-4720-be83-d8dded5070c2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.677 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:34 compute-0 nova_compute[187212]: 2025-11-25 19:21:34.794 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:35 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:35.498 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:21:35 compute-0 nova_compute[187212]: 2025-11-25 19:21:35.500 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:35 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:35.500 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:21:35 compute-0 nova_compute[187212]: 2025-11-25 19:21:35.898 187216 DEBUG nova.network.neutron [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Port 38222a41-1966-455e-afa1-2bb851b0c328 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Nov 25 19:21:35 compute-0 nova_compute[187212]: 2025-11-25 19:21:35.913 187216 DEBUG nova.compute.manager [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsksj4kj6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6098ca7-42a8-4720-be83-d8dded5070c2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Nov 25 19:21:36 compute-0 podman[214414]: 2025-11-25 19:21:36.188460729 +0000 UTC m=+0.090123145 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:21:37 compute-0 nova_compute[187212]: 2025-11-25 19:21:37.732 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:38 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 25 19:21:38 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 25 19:21:39 compute-0 kernel: tap38222a41-19: entered promiscuous mode
Nov 25 19:21:39 compute-0 NetworkManager[55552]: <info>  [1764098499.0513] manager: (tap38222a41-19): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 25 19:21:39 compute-0 ovn_controller[95465]: 2025-11-25T19:21:39Z|00120|binding|INFO|Claiming lport 38222a41-1966-455e-afa1-2bb851b0c328 for this additional chassis.
Nov 25 19:21:39 compute-0 ovn_controller[95465]: 2025-11-25T19:21:39Z|00121|binding|INFO|38222a41-1966-455e-afa1-2bb851b0c328: Claiming fa:16:3e:6a:47:e2 10.100.0.12
Nov 25 19:21:39 compute-0 nova_compute[187212]: 2025-11-25 19:21:39.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.063 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:47:e2 10.100.0.12'], port_security=['fa:16:3e:6a:47:e2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f6098ca7-42a8-4720-be83-d8dded5070c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=38222a41-1966-455e-afa1-2bb851b0c328) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.064 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 38222a41-1966-455e-afa1-2bb851b0c328 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 unbound from our chassis
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.066 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:21:39 compute-0 ovn_controller[95465]: 2025-11-25T19:21:39Z|00122|binding|INFO|Setting lport 38222a41-1966-455e-afa1-2bb851b0c328 ovn-installed in OVS
Nov 25 19:21:39 compute-0 nova_compute[187212]: 2025-11-25 19:21:39.086 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:39 compute-0 nova_compute[187212]: 2025-11-25 19:21:39.087 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.089 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb85878-f251-4d70-972c-0e8dcd685eb0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 systemd-udevd[214475]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:21:39 compute-0 systemd-machined[153494]: New machine qemu-11-instance-0000000c.
Nov 25 19:21:39 compute-0 NetworkManager[55552]: <info>  [1764098499.1285] device (tap38222a41-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:21:39 compute-0 NetworkManager[55552]: <info>  [1764098499.1299] device (tap38222a41-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.136 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[fa823759-81c9-49d5-8234-f05b95a5d5d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000c.
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.139 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf84897-f61d-4f84-b35d-f99a4a72e829]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.178 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[fb179007-ef10-4637-a28c-0d61070e2608]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.206 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0ce34c-abc4-4a1c-9df6-d56d58398e32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441071, 'reachable_time': 20794, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214481, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.229 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1d39d75f-5a6c-4b37-bdb9-ce09ea0d1e05]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441085, 'tstamp': 441085}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214486, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441090, 'tstamp': 441090}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214486, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.231 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:39 compute-0 nova_compute[187212]: 2025-11-25 19:21:39.233 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:39 compute-0 nova_compute[187212]: 2025-11-25 19:21:39.234 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.235 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c041141-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.235 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.235 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c041141-a0, col_values=(('external_ids', {'iface-id': '9941ceeb-16f5-4a0e-8227-c1de720c5499'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.236 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:21:39 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:39.238 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3a134d22-e2f4-4670-acb2-3beddbd914b8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-4c041141-ab86-4697-993b-67edbc4f2488\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 4c041141-ab86-4697-993b-67edbc4f2488\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:39 compute-0 nova_compute[187212]: 2025-11-25 19:21:39.667 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:41 compute-0 ovn_controller[95465]: 2025-11-25T19:21:41Z|00123|binding|INFO|Claiming lport 38222a41-1966-455e-afa1-2bb851b0c328 for this chassis.
Nov 25 19:21:41 compute-0 ovn_controller[95465]: 2025-11-25T19:21:41Z|00124|binding|INFO|38222a41-1966-455e-afa1-2bb851b0c328: Claiming fa:16:3e:6a:47:e2 10.100.0.12
Nov 25 19:21:41 compute-0 ovn_controller[95465]: 2025-11-25T19:21:41Z|00125|binding|INFO|Setting lport 38222a41-1966-455e-afa1-2bb851b0c328 up in Southbound
Nov 25 19:21:42 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:42.502 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:42 compute-0 nova_compute[187212]: 2025-11-25 19:21:42.773 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:43 compute-0 podman[214504]: 2025-11-25 19:21:43.239259589 +0000 UTC m=+0.149975828 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 19:21:43 compute-0 nova_compute[187212]: 2025-11-25 19:21:43.551 187216 INFO nova.compute.manager [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Post operation of migration started
Nov 25 19:21:43 compute-0 nova_compute[187212]: 2025-11-25 19:21:43.552 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:44 compute-0 nova_compute[187212]: 2025-11-25 19:21:44.435 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:44 compute-0 nova_compute[187212]: 2025-11-25 19:21:44.436 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:44 compute-0 nova_compute[187212]: 2025-11-25 19:21:44.554 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-f6098ca7-42a8-4720-be83-d8dded5070c2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:21:44 compute-0 nova_compute[187212]: 2025-11-25 19:21:44.555 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-f6098ca7-42a8-4720-be83-d8dded5070c2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:21:44 compute-0 nova_compute[187212]: 2025-11-25 19:21:44.555 187216 DEBUG nova.network.neutron [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:21:44 compute-0 nova_compute[187212]: 2025-11-25 19:21:44.670 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:45 compute-0 nova_compute[187212]: 2025-11-25 19:21:45.064 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:45 compute-0 nova_compute[187212]: 2025-11-25 19:21:45.840 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:46 compute-0 nova_compute[187212]: 2025-11-25 19:21:46.057 187216 DEBUG nova.network.neutron [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Updating instance_info_cache with network_info: [{"id": "38222a41-1966-455e-afa1-2bb851b0c328", "address": "fa:16:3e:6a:47:e2", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38222a41-19", "ovs_interfaceid": "38222a41-1966-455e-afa1-2bb851b0c328", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:21:46 compute-0 podman[214530]: 2025-11-25 19:21:46.160912993 +0000 UTC m=+0.080969362 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Nov 25 19:21:46 compute-0 nova_compute[187212]: 2025-11-25 19:21:46.567 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-f6098ca7-42a8-4720-be83-d8dded5070c2" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:21:47 compute-0 nova_compute[187212]: 2025-11-25 19:21:47.092 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:47 compute-0 nova_compute[187212]: 2025-11-25 19:21:47.093 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:47 compute-0 nova_compute[187212]: 2025-11-25 19:21:47.093 187216 DEBUG oslo_concurrency.lockutils [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:47 compute-0 nova_compute[187212]: 2025-11-25 19:21:47.099 187216 INFO nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 25 19:21:47 compute-0 virtqemud[186888]: Domain id=11 name='instance-0000000c' uuid=f6098ca7-42a8-4720-be83-d8dded5070c2 is tainted: custom-monitor
Nov 25 19:21:47 compute-0 nova_compute[187212]: 2025-11-25 19:21:47.776 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:48 compute-0 nova_compute[187212]: 2025-11-25 19:21:48.105 187216 INFO nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 25 19:21:49 compute-0 nova_compute[187212]: 2025-11-25 19:21:49.114 187216 INFO nova.virt.libvirt.driver [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 25 19:21:49 compute-0 nova_compute[187212]: 2025-11-25 19:21:49.125 187216 DEBUG nova.compute.manager [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:21:49 compute-0 nova_compute[187212]: 2025-11-25 19:21:49.641 187216 DEBUG nova.objects.instance [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Nov 25 19:21:49 compute-0 nova_compute[187212]: 2025-11-25 19:21:49.674 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:50 compute-0 nova_compute[187212]: 2025-11-25 19:21:50.666 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:51 compute-0 nova_compute[187212]: 2025-11-25 19:21:51.463 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:51 compute-0 nova_compute[187212]: 2025-11-25 19:21:51.464 187216 WARNING neutronclient.v2_0.client [None req-f64c27b2-b9d8-4287-b612-59567dabd35c 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:52 compute-0 podman[214549]: 2025-11-25 19:21:52.191809755 +0000 UTC m=+0.099817971 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal)
Nov 25 19:21:52 compute-0 nova_compute[187212]: 2025-11-25 19:21:52.779 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:54 compute-0 nova_compute[187212]: 2025-11-25 19:21:54.676 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:55 compute-0 nova_compute[187212]: 2025-11-25 19:21:55.547 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:55 compute-0 nova_compute[187212]: 2025-11-25 19:21:55.548 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:55 compute-0 nova_compute[187212]: 2025-11-25 19:21:55.549 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:55 compute-0 nova_compute[187212]: 2025-11-25 19:21:55.549 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:55 compute-0 nova_compute[187212]: 2025-11-25 19:21:55.550 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:55 compute-0 nova_compute[187212]: 2025-11-25 19:21:55.701 187216 INFO nova.compute.manager [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Terminating instance
Nov 25 19:21:56 compute-0 podman[214570]: 2025-11-25 19:21:56.201718647 +0000 UTC m=+0.119603825 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.298 187216 DEBUG nova.compute.manager [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:21:56 compute-0 kernel: tap059368de-93 (unregistering): left promiscuous mode
Nov 25 19:21:56 compute-0 NetworkManager[55552]: <info>  [1764098516.3309] device (tap059368de-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:21:56 compute-0 ovn_controller[95465]: 2025-11-25T19:21:56Z|00126|binding|INFO|Releasing lport 059368de-9300-4ad9-a662-f1bd38b00db6 from this chassis (sb_readonly=0)
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.372 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:56 compute-0 ovn_controller[95465]: 2025-11-25T19:21:56Z|00127|binding|INFO|Setting lport 059368de-9300-4ad9-a662-f1bd38b00db6 down in Southbound
Nov 25 19:21:56 compute-0 ovn_controller[95465]: 2025-11-25T19:21:56Z|00128|binding|INFO|Removing iface tap059368de-93 ovn-installed in OVS
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.377 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.383 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:b5:dd 10.100.0.10'], port_security=['fa:16:3e:9f:b5:dd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4ab49376-6ee4-40fa-b6ed-b289cfc5c43c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=059368de-9300-4ad9-a662-f1bd38b00db6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.383 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 059368de-9300-4ad9-a662-f1bd38b00db6 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 unbound from our chassis
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.385 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.398 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.411 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8d154307-7fed-4491-be5a-82fd96463069]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 25 19:21:56 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 15.070s CPU time.
Nov 25 19:21:56 compute-0 systemd-machined[153494]: Machine qemu-10-instance-0000000d terminated.
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.462 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[caafb662-9075-4745-96b3-92873e2b8511]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.467 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cde4d9-d859-4d3d-9a5f-867b6f1031d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.511 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[15efe0e6-b854-42d5-b112-78a6be2af690]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.540 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0657236d-b987-4d8b-9a1b-5d2d2aa05ec1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441071, 'reachable_time': 20794, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214603, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.565 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[26ffc720-983a-42bc-9322-762baa15276e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441085, 'tstamp': 441085}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214613, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441090, 'tstamp': 441090}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214613, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.568 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.570 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.578 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c041141-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.578 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.579 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c041141-a0, col_values=(('external_ids', {'iface-id': '9941ceeb-16f5-4a0e-8227-c1de720c5499'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.579 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.579 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:21:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:21:56.581 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ca780e24-df59-46fe-af7f-908a54cba881]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-4c041141-ab86-4697-993b-67edbc4f2488\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 4c041141-ab86-4697-993b-67edbc4f2488\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.591 187216 INFO nova.virt.libvirt.driver [-] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Instance destroyed successfully.
Nov 25 19:21:56 compute-0 nova_compute[187212]: 2025-11-25 19:21:56.592 187216 DEBUG nova.objects.instance [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lazy-loading 'resources' on Instance uuid 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.100 187216 DEBUG nova.virt.libvirt.vif [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:20:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-426547155',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-426547155',id=13,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:21:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-mfla5az4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:21:04Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=4ab49376-6ee4-40fa-b6ed-b289cfc5c43c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.101 187216 DEBUG nova.network.os_vif_util [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "059368de-9300-4ad9-a662-f1bd38b00db6", "address": "fa:16:3e:9f:b5:dd", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap059368de-93", "ovs_interfaceid": "059368de-9300-4ad9-a662-f1bd38b00db6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.102 187216 DEBUG nova.network.os_vif_util [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.102 187216 DEBUG os_vif [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.105 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.106 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap059368de-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.107 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.111 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.112 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.112 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=da53c874-8c71-41f3-a173-6bf3bde3a680) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.113 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.114 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.117 187216 INFO os_vif [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:b5:dd,bridge_name='br-int',has_traffic_filtering=True,id=059368de-9300-4ad9-a662-f1bd38b00db6,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap059368de-93')
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.118 187216 INFO nova.virt.libvirt.driver [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Deleting instance files /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c_del
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.119 187216 INFO nova.virt.libvirt.driver [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Deletion of /var/lib/nova/instances/4ab49376-6ee4-40fa-b6ed-b289cfc5c43c_del complete
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.349 187216 DEBUG nova.compute.manager [req-9fb44fbb-41d9-435e-a14d-c8dd0febd96c req-37f96f9c-e6e9-4f6c-b941-45da1dd80fff 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-unplugged-059368de-9300-4ad9-a662-f1bd38b00db6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.349 187216 DEBUG oslo_concurrency.lockutils [req-9fb44fbb-41d9-435e-a14d-c8dd0febd96c req-37f96f9c-e6e9-4f6c-b941-45da1dd80fff 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.350 187216 DEBUG oslo_concurrency.lockutils [req-9fb44fbb-41d9-435e-a14d-c8dd0febd96c req-37f96f9c-e6e9-4f6c-b941-45da1dd80fff 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.350 187216 DEBUG oslo_concurrency.lockutils [req-9fb44fbb-41d9-435e-a14d-c8dd0febd96c req-37f96f9c-e6e9-4f6c-b941-45da1dd80fff 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.350 187216 DEBUG nova.compute.manager [req-9fb44fbb-41d9-435e-a14d-c8dd0febd96c req-37f96f9c-e6e9-4f6c-b941-45da1dd80fff 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] No waiting events found dispatching network-vif-unplugged-059368de-9300-4ad9-a662-f1bd38b00db6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.351 187216 DEBUG nova.compute.manager [req-9fb44fbb-41d9-435e-a14d-c8dd0febd96c req-37f96f9c-e6e9-4f6c-b941-45da1dd80fff 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-unplugged-059368de-9300-4ad9-a662-f1bd38b00db6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.634 187216 INFO nova.compute.manager [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Took 1.34 seconds to destroy the instance on the hypervisor.
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.635 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.635 187216 DEBUG nova.compute.manager [-] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.635 187216 DEBUG nova.network.neutron [-] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.636 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:57 compute-0 nova_compute[187212]: 2025-11-25 19:21:57.781 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:21:58 compute-0 nova_compute[187212]: 2025-11-25 19:21:58.478 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:21:59 compute-0 nova_compute[187212]: 2025-11-25 19:21:59.506 187216 DEBUG nova.compute.manager [req-970fa800-7309-4e35-a216-4dcdc6a4f645 req-722e3f68-824d-4d07-a9e4-2c51ef4ef727 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-unplugged-059368de-9300-4ad9-a662-f1bd38b00db6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:21:59 compute-0 nova_compute[187212]: 2025-11-25 19:21:59.507 187216 DEBUG oslo_concurrency.lockutils [req-970fa800-7309-4e35-a216-4dcdc6a4f645 req-722e3f68-824d-4d07-a9e4-2c51ef4ef727 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:21:59 compute-0 nova_compute[187212]: 2025-11-25 19:21:59.507 187216 DEBUG oslo_concurrency.lockutils [req-970fa800-7309-4e35-a216-4dcdc6a4f645 req-722e3f68-824d-4d07-a9e4-2c51ef4ef727 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:21:59 compute-0 nova_compute[187212]: 2025-11-25 19:21:59.508 187216 DEBUG oslo_concurrency.lockutils [req-970fa800-7309-4e35-a216-4dcdc6a4f645 req-722e3f68-824d-4d07-a9e4-2c51ef4ef727 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:21:59 compute-0 nova_compute[187212]: 2025-11-25 19:21:59.508 187216 DEBUG nova.compute.manager [req-970fa800-7309-4e35-a216-4dcdc6a4f645 req-722e3f68-824d-4d07-a9e4-2c51ef4ef727 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] No waiting events found dispatching network-vif-unplugged-059368de-9300-4ad9-a662-f1bd38b00db6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:21:59 compute-0 nova_compute[187212]: 2025-11-25 19:21:59.509 187216 DEBUG nova.compute.manager [req-970fa800-7309-4e35-a216-4dcdc6a4f645 req-722e3f68-824d-4d07-a9e4-2c51ef4ef727 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-unplugged-059368de-9300-4ad9-a662-f1bd38b00db6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:21:59 compute-0 podman[197585]: time="2025-11-25T19:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:21:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:21:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3084 "" "Go-http-client/1.1"
Nov 25 19:22:00 compute-0 nova_compute[187212]: 2025-11-25 19:22:00.672 187216 DEBUG nova.compute.manager [req-3e678301-688b-40ef-8a7b-c88e5371a2a9 req-c8680e12-d212-4862-bb14-6a3838e51ed5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Received event network-vif-deleted-059368de-9300-4ad9-a662-f1bd38b00db6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:00 compute-0 nova_compute[187212]: 2025-11-25 19:22:00.673 187216 INFO nova.compute.manager [req-3e678301-688b-40ef-8a7b-c88e5371a2a9 req-c8680e12-d212-4862-bb14-6a3838e51ed5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Neutron deleted interface 059368de-9300-4ad9-a662-f1bd38b00db6; detaching it from the instance and deleting it from the info cache
Nov 25 19:22:00 compute-0 nova_compute[187212]: 2025-11-25 19:22:00.673 187216 DEBUG nova.network.neutron [req-3e678301-688b-40ef-8a7b-c88e5371a2a9 req-c8680e12-d212-4862-bb14-6a3838e51ed5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:22:01 compute-0 nova_compute[187212]: 2025-11-25 19:22:01.031 187216 DEBUG nova.network.neutron [-] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:22:01 compute-0 nova_compute[187212]: 2025-11-25 19:22:01.184 187216 DEBUG nova.compute.manager [req-3e678301-688b-40ef-8a7b-c88e5371a2a9 req-c8680e12-d212-4862-bb14-6a3838e51ed5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Detach interface failed, port_id=059368de-9300-4ad9-a662-f1bd38b00db6, reason: Instance 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: ERROR   19:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: ERROR   19:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: ERROR   19:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: ERROR   19:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: ERROR   19:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:22:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:22:01 compute-0 nova_compute[187212]: 2025-11-25 19:22:01.544 187216 INFO nova.compute.manager [-] [instance: 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c] Took 3.91 seconds to deallocate network for instance.
Nov 25 19:22:02 compute-0 nova_compute[187212]: 2025-11-25 19:22:02.080 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:02 compute-0 nova_compute[187212]: 2025-11-25 19:22:02.081 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:02 compute-0 nova_compute[187212]: 2025-11-25 19:22:02.113 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:02 compute-0 nova_compute[187212]: 2025-11-25 19:22:02.173 187216 DEBUG nova.compute.provider_tree [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:22:02 compute-0 nova_compute[187212]: 2025-11-25 19:22:02.682 187216 DEBUG nova.scheduler.client.report [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:22:02 compute-0 nova_compute[187212]: 2025-11-25 19:22:02.785 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:03 compute-0 nova_compute[187212]: 2025-11-25 19:22:03.194 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:03 compute-0 nova_compute[187212]: 2025-11-25 19:22:03.230 187216 INFO nova.scheduler.client.report [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Deleted allocations for instance 4ab49376-6ee4-40fa-b6ed-b289cfc5c43c
Nov 25 19:22:04 compute-0 nova_compute[187212]: 2025-11-25 19:22:04.276 187216 DEBUG oslo_concurrency.lockutils [None req-4b97cf4c-0ed7-43a9-9015-887887f6eba0 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "4ab49376-6ee4-40fa-b6ed-b289cfc5c43c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.727s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.122 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "f6098ca7-42a8-4720-be83-d8dded5070c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.123 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.124 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.124 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.125 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.144 187216 INFO nova.compute.manager [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Terminating instance
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.668 187216 DEBUG nova.compute.manager [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:22:05 compute-0 kernel: tap38222a41-19 (unregistering): left promiscuous mode
Nov 25 19:22:05 compute-0 NetworkManager[55552]: <info>  [1764098525.6952] device (tap38222a41-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:22:05 compute-0 ovn_controller[95465]: 2025-11-25T19:22:05Z|00129|binding|INFO|Releasing lport 38222a41-1966-455e-afa1-2bb851b0c328 from this chassis (sb_readonly=0)
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.702 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:05 compute-0 ovn_controller[95465]: 2025-11-25T19:22:05Z|00130|binding|INFO|Setting lport 38222a41-1966-455e-afa1-2bb851b0c328 down in Southbound
Nov 25 19:22:05 compute-0 ovn_controller[95465]: 2025-11-25T19:22:05Z|00131|binding|INFO|Removing iface tap38222a41-19 ovn-installed in OVS
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.706 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:05.711 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:47:e2 10.100.0.12'], port_security=['fa:16:3e:6a:47:e2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f6098ca7-42a8-4720-be83-d8dded5070c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=38222a41-1966-455e-afa1-2bb851b0c328) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:22:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:05.712 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 38222a41-1966-455e-afa1-2bb851b0c328 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 unbound from our chassis
Nov 25 19:22:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:05.718 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c041141-ab86-4697-993b-67edbc4f2488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:22:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:05.719 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f5848c2f-9599-4e0d-8127-7c7eafb74e57]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:05.721 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 namespace which is not needed anymore
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.723 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:05 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 25 19:22:05 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000c.scope: Consumed 2.876s CPU time.
Nov 25 19:22:05 compute-0 systemd-machined[153494]: Machine qemu-11-instance-0000000c terminated.
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.857 187216 DEBUG nova.compute.manager [req-9266e7b0-1043-411e-a77d-24087fbb9cfa req-8ac29715-ec16-4dda-bf94-146206b24db1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Received event network-vif-unplugged-38222a41-1966-455e-afa1-2bb851b0c328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.858 187216 DEBUG oslo_concurrency.lockutils [req-9266e7b0-1043-411e-a77d-24087fbb9cfa req-8ac29715-ec16-4dda-bf94-146206b24db1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.859 187216 DEBUG oslo_concurrency.lockutils [req-9266e7b0-1043-411e-a77d-24087fbb9cfa req-8ac29715-ec16-4dda-bf94-146206b24db1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.860 187216 DEBUG oslo_concurrency.lockutils [req-9266e7b0-1043-411e-a77d-24087fbb9cfa req-8ac29715-ec16-4dda-bf94-146206b24db1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.860 187216 DEBUG nova.compute.manager [req-9266e7b0-1043-411e-a77d-24087fbb9cfa req-8ac29715-ec16-4dda-bf94-146206b24db1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] No waiting events found dispatching network-vif-unplugged-38222a41-1966-455e-afa1-2bb851b0c328 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.860 187216 DEBUG nova.compute.manager [req-9266e7b0-1043-411e-a77d-24087fbb9cfa req-8ac29715-ec16-4dda-bf94-146206b24db1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Received event network-vif-unplugged-38222a41-1966-455e-afa1-2bb851b0c328 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:22:05 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [NOTICE]   (214240) : haproxy version is 3.0.5-8e879a5
Nov 25 19:22:05 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [NOTICE]   (214240) : path to executable is /usr/sbin/haproxy
Nov 25 19:22:05 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [WARNING]  (214240) : Exiting Master process...
Nov 25 19:22:05 compute-0 podman[214647]: 2025-11-25 19:22:05.911744271 +0000 UTC m=+0.044243192 container kill b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:22:05 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [ALERT]    (214240) : Current worker (214242) exited with code 143 (Terminated)
Nov 25 19:22:05 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[214236]: [WARNING]  (214240) : All workers exited. Exiting... (0)
Nov 25 19:22:05 compute-0 systemd[1]: libpod-b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d.scope: Deactivated successfully.
Nov 25 19:22:05 compute-0 conmon[214236]: conmon b9f9441ee44e63186fd8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d.scope/container/memory.events
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.964 187216 INFO nova.virt.libvirt.driver [-] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Instance destroyed successfully.
Nov 25 19:22:05 compute-0 nova_compute[187212]: 2025-11-25 19:22:05.966 187216 DEBUG nova.objects.instance [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lazy-loading 'resources' on Instance uuid f6098ca7-42a8-4720-be83-d8dded5070c2 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:22:05 compute-0 podman[214670]: 2025-11-25 19:22:05.981590618 +0000 UTC m=+0.041853158 container died b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:22:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-d8019e5a190d873e9c0aae319220dde3f36d7c09476638025b4ed75b571168f7-merged.mount: Deactivated successfully.
Nov 25 19:22:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d-userdata-shm.mount: Deactivated successfully.
Nov 25 19:22:06 compute-0 podman[214670]: 2025-11-25 19:22:06.033187493 +0000 UTC m=+0.093449993 container cleanup b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:22:06 compute-0 systemd[1]: libpod-conmon-b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d.scope: Deactivated successfully.
Nov 25 19:22:06 compute-0 podman[214676]: 2025-11-25 19:22:06.057200108 +0000 UTC m=+0.099418140 container remove b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.066 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[29400f64-2ce1-48f2-b582-5433140d8da1]: (4, ("Tue Nov 25 07:22:05 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 (b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d)\nb9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d\nTue Nov 25 07:22:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 (b9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d)\nb9f9441ee44e63186fd8bb1715a570bb3a22c02dc522d69a4696afef250a160d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.068 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c4efa306-6665-4d2d-9210-c5d2545e0cfd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.068 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.069 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ced223f6-bec1-43ac-a4e2-db4a2d38cb58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.070 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:06 compute-0 kernel: tap4c041141-a0: left promiscuous mode
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.074 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.107 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[52d1b342-915d-4440-b8b1-c17d5ed6b2e4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.126 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[dafd2211-13d8-4667-a34c-36fb4076fb3f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.127 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4cb8bb-3035-42c9-9f6a-0f9e4afeb7e2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.153 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4b74fe4b-5059-4be6-b46c-03df7497b09f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441062, 'reachable_time': 42270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214713, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.156 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:22:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:06.156 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[f7766bfd-e843-4f41-a2a9-70b0ce5d9a9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d4c041141\x2dab86\x2d4697\x2d993b\x2d67edbc4f2488.mount: Deactivated successfully.
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.474 187216 DEBUG nova.virt.libvirt.vif [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-11-25T19:20:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1516267269',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1516267269',id=12,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:20:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-0s0cry0q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',clean_attempts='1',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:21:50Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=f6098ca7-42a8-4720-be83-d8dded5070c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38222a41-1966-455e-afa1-2bb851b0c328", "address": "fa:16:3e:6a:47:e2", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38222a41-19", "ovs_interfaceid": "38222a41-1966-455e-afa1-2bb851b0c328", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.475 187216 DEBUG nova.network.os_vif_util [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "38222a41-1966-455e-afa1-2bb851b0c328", "address": "fa:16:3e:6a:47:e2", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38222a41-19", "ovs_interfaceid": "38222a41-1966-455e-afa1-2bb851b0c328", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.477 187216 DEBUG nova.network.os_vif_util [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:47:e2,bridge_name='br-int',has_traffic_filtering=True,id=38222a41-1966-455e-afa1-2bb851b0c328,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38222a41-19') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.478 187216 DEBUG os_vif [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:47:e2,bridge_name='br-int',has_traffic_filtering=True,id=38222a41-1966-455e-afa1-2bb851b0c328,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38222a41-19') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.480 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.480 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38222a41-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.506 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.509 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.510 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.510 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=076da707-2634-4a59-b129-e316e1921a47) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.513 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.516 187216 INFO os_vif [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:47:e2,bridge_name='br-int',has_traffic_filtering=True,id=38222a41-1966-455e-afa1-2bb851b0c328,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38222a41-19')
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.517 187216 INFO nova.virt.libvirt.driver [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Deleting instance files /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2_del
Nov 25 19:22:06 compute-0 nova_compute[187212]: 2025-11-25 19:22:06.518 187216 INFO nova.virt.libvirt.driver [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Deletion of /var/lib/nova/instances/f6098ca7-42a8-4720-be83-d8dded5070c2_del complete
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.034 187216 INFO nova.compute.manager [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Took 1.36 seconds to destroy the instance on the hypervisor.
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.034 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.035 187216 DEBUG nova.compute.manager [-] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.035 187216 DEBUG nova.network.neutron [-] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.036 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:07 compute-0 podman[214714]: 2025-11-25 19:22:07.177898824 +0000 UTC m=+0.093158425 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.493 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.787 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.920 187216 DEBUG nova.compute.manager [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Received event network-vif-unplugged-38222a41-1966-455e-afa1-2bb851b0c328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.920 187216 DEBUG oslo_concurrency.lockutils [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.921 187216 DEBUG oslo_concurrency.lockutils [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.921 187216 DEBUG oslo_concurrency.lockutils [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.921 187216 DEBUG nova.compute.manager [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] No waiting events found dispatching network-vif-unplugged-38222a41-1966-455e-afa1-2bb851b0c328 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.922 187216 DEBUG nova.compute.manager [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Received event network-vif-unplugged-38222a41-1966-455e-afa1-2bb851b0c328 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.922 187216 DEBUG nova.compute.manager [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Received event network-vif-deleted-38222a41-1966-455e-afa1-2bb851b0c328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.922 187216 INFO nova.compute.manager [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Neutron deleted interface 38222a41-1966-455e-afa1-2bb851b0c328; detaching it from the instance and deleting it from the info cache
Nov 25 19:22:07 compute-0 nova_compute[187212]: 2025-11-25 19:22:07.922 187216 DEBUG nova.network.neutron [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:22:08 compute-0 nova_compute[187212]: 2025-11-25 19:22:08.280 187216 DEBUG nova.network.neutron [-] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:22:08 compute-0 nova_compute[187212]: 2025-11-25 19:22:08.431 187216 DEBUG nova.compute.manager [req-0533e3e6-0fa5-4652-be40-1a902109a74e req-97b45646-ffb1-4492-b92e-5014bc524a16 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Detach interface failed, port_id=38222a41-1966-455e-afa1-2bb851b0c328, reason: Instance f6098ca7-42a8-4720-be83-d8dded5070c2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:22:08 compute-0 nova_compute[187212]: 2025-11-25 19:22:08.787 187216 INFO nova.compute.manager [-] [instance: f6098ca7-42a8-4720-be83-d8dded5070c2] Took 1.75 seconds to deallocate network for instance.
Nov 25 19:22:09 compute-0 nova_compute[187212]: 2025-11-25 19:22:09.326 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:09 compute-0 nova_compute[187212]: 2025-11-25 19:22:09.327 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:09 compute-0 nova_compute[187212]: 2025-11-25 19:22:09.333 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:09 compute-0 nova_compute[187212]: 2025-11-25 19:22:09.371 187216 INFO nova.scheduler.client.report [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Deleted allocations for instance f6098ca7-42a8-4720-be83-d8dded5070c2
Nov 25 19:22:10 compute-0 nova_compute[187212]: 2025-11-25 19:22:10.404 187216 DEBUG oslo_concurrency.lockutils [None req-50d7eaf3-038e-4ac6-874c-c05c5cc50ab9 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "f6098ca7-42a8-4720-be83-d8dded5070c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.280s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:11 compute-0 nova_compute[187212]: 2025-11-25 19:22:11.512 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:12 compute-0 nova_compute[187212]: 2025-11-25 19:22:12.790 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:14 compute-0 podman[214739]: 2025-11-25 19:22:14.233748738 +0000 UTC m=+0.148931271 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 19:22:16 compute-0 nova_compute[187212]: 2025-11-25 19:22:16.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:17 compute-0 podman[214767]: 2025-11-25 19:22:17.172662838 +0000 UTC m=+0.079870073 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:22:17 compute-0 nova_compute[187212]: 2025-11-25 19:22:17.792 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:20 compute-0 nova_compute[187212]: 2025-11-25 19:22:20.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:20 compute-0 nova_compute[187212]: 2025-11-25 19:22:20.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:22:21 compute-0 nova_compute[187212]: 2025-11-25 19:22:21.561 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:22 compute-0 nova_compute[187212]: 2025-11-25 19:22:22.845 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:23 compute-0 podman[214785]: 2025-11-25 19:22:23.151936415 +0000 UTC m=+0.073487755 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7)
Nov 25 19:22:25 compute-0 nova_compute[187212]: 2025-11-25 19:22:25.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.564 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:22:26 compute-0 podman[214807]: 2025-11-25 19:22:26.864635974 +0000 UTC m=+0.106574540 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.962 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:22:26 compute-0 nova_compute[187212]: 2025-11-25 19:22:26.964 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:27 compute-0 nova_compute[187212]: 2025-11-25 19:22:27.007 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:27 compute-0 nova_compute[187212]: 2025-11-25 19:22:27.008 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5807MB free_disk=72.99283218383789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:22:27 compute-0 nova_compute[187212]: 2025-11-25 19:22:27.008 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:27 compute-0 nova_compute[187212]: 2025-11-25 19:22:27.009 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:27 compute-0 nova_compute[187212]: 2025-11-25 19:22:27.849 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:28 compute-0 nova_compute[187212]: 2025-11-25 19:22:28.070 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:22:28 compute-0 nova_compute[187212]: 2025-11-25 19:22:28.071 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:22:26 up  1:14,  0 user,  load average: 0.27, 0.40, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:22:28 compute-0 nova_compute[187212]: 2025-11-25 19:22:28.095 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:22:28 compute-0 nova_compute[187212]: 2025-11-25 19:22:28.604 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:22:29 compute-0 nova_compute[187212]: 2025-11-25 19:22:29.118 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:22:29 compute-0 nova_compute[187212]: 2025-11-25 19:22:29.119 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:29 compute-0 podman[197585]: time="2025-11-25T19:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:22:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:22:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2621 "" "Go-http-client/1.1"
Nov 25 19:22:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:31.099 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:31.099 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:31.099 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: ERROR   19:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: ERROR   19:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: ERROR   19:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: ERROR   19:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: ERROR   19:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:22:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:22:31 compute-0 nova_compute[187212]: 2025-11-25 19:22:31.566 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:32 compute-0 nova_compute[187212]: 2025-11-25 19:22:32.890 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:33 compute-0 nova_compute[187212]: 2025-11-25 19:22:33.120 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:33 compute-0 nova_compute[187212]: 2025-11-25 19:22:33.121 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:33 compute-0 nova_compute[187212]: 2025-11-25 19:22:33.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:34 compute-0 nova_compute[187212]: 2025-11-25 19:22:34.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:22:35 compute-0 nova_compute[187212]: 2025-11-25 19:22:35.409 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:35 compute-0 nova_compute[187212]: 2025-11-25 19:22:35.410 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:35 compute-0 nova_compute[187212]: 2025-11-25 19:22:35.919 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:22:36 compute-0 nova_compute[187212]: 2025-11-25 19:22:36.471 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:36 compute-0 nova_compute[187212]: 2025-11-25 19:22:36.472 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:36 compute-0 nova_compute[187212]: 2025-11-25 19:22:36.479 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:22:36 compute-0 nova_compute[187212]: 2025-11-25 19:22:36.480 187216 INFO nova.compute.claims [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:22:36 compute-0 nova_compute[187212]: 2025-11-25 19:22:36.602 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:37 compute-0 nova_compute[187212]: 2025-11-25 19:22:37.539 187216 DEBUG nova.compute.provider_tree [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:22:37 compute-0 nova_compute[187212]: 2025-11-25 19:22:37.928 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:38 compute-0 nova_compute[187212]: 2025-11-25 19:22:38.047 187216 DEBUG nova.scheduler.client.report [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:22:38 compute-0 podman[214830]: 2025-11-25 19:22:38.053318931 +0000 UTC m=+0.085008418 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:22:38 compute-0 nova_compute[187212]: 2025-11-25 19:22:38.559 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.087s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:38 compute-0 nova_compute[187212]: 2025-11-25 19:22:38.561 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:22:39 compute-0 nova_compute[187212]: 2025-11-25 19:22:39.078 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:22:39 compute-0 nova_compute[187212]: 2025-11-25 19:22:39.078 187216 DEBUG nova.network.neutron [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:22:39 compute-0 nova_compute[187212]: 2025-11-25 19:22:39.079 187216 WARNING neutronclient.v2_0.client [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:39 compute-0 nova_compute[187212]: 2025-11-25 19:22:39.080 187216 WARNING neutronclient.v2_0.client [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:39 compute-0 nova_compute[187212]: 2025-11-25 19:22:39.588 187216 INFO nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:22:40 compute-0 nova_compute[187212]: 2025-11-25 19:22:40.098 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:22:40 compute-0 nova_compute[187212]: 2025-11-25 19:22:40.601 187216 DEBUG nova.network.neutron [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Successfully created port: f8f39500-5b73-4257-af08-30ed674e5d0c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.119 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.120 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.121 187216 INFO nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Creating image(s)
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.122 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "/var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.122 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "/var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.124 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "/var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.125 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.131 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.133 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.219 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.221 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.221 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.222 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.229 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.230 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.322 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.324 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.347 187216 DEBUG nova.network.neutron [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Successfully updated port: f8f39500-5b73-4257-af08-30ed674e5d0c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.370 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.371 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.372 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.443 187216 DEBUG nova.compute.manager [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-changed-f8f39500-5b73-4257-af08-30ed674e5d0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.444 187216 DEBUG nova.compute.manager [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Refreshing instance network info cache due to event network-changed-f8f39500-5b73-4257-af08-30ed674e5d0c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.444 187216 DEBUG oslo_concurrency.lockutils [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-1140b061-ca3b-44fb-9523-49b86ac5c5e8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.445 187216 DEBUG oslo_concurrency.lockutils [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-1140b061-ca3b-44fb-9523-49b86ac5c5e8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.445 187216 DEBUG nova.network.neutron [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Refreshing network info cache for port f8f39500-5b73-4257-af08-30ed674e5d0c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.462 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.463 187216 DEBUG nova.virt.disk.api [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Checking if we can resize image /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.463 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.531 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.533 187216 DEBUG nova.virt.disk.api [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Cannot resize image /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.533 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.534 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Ensure instance console log exists: /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.535 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.535 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.536 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.645 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.855 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "refresh_cache-1140b061-ca3b-44fb-9523-49b86ac5c5e8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:22:41 compute-0 nova_compute[187212]: 2025-11-25 19:22:41.954 187216 WARNING neutronclient.v2_0.client [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:42 compute-0 nova_compute[187212]: 2025-11-25 19:22:42.516 187216 DEBUG nova.network.neutron [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:22:42 compute-0 nova_compute[187212]: 2025-11-25 19:22:42.751 187216 DEBUG nova.network.neutron [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:22:42 compute-0 nova_compute[187212]: 2025-11-25 19:22:42.978 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:43 compute-0 nova_compute[187212]: 2025-11-25 19:22:43.259 187216 DEBUG oslo_concurrency.lockutils [req-689eeebc-1f02-4ee7-a89b-d7560048ae49 req-1ecdf330-04b0-4450-a774-10ce80d97a8d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-1140b061-ca3b-44fb-9523-49b86ac5c5e8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:22:43 compute-0 nova_compute[187212]: 2025-11-25 19:22:43.260 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquired lock "refresh_cache-1140b061-ca3b-44fb-9523-49b86ac5c5e8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:22:43 compute-0 nova_compute[187212]: 2025-11-25 19:22:43.260 187216 DEBUG nova.network.neutron [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:22:44 compute-0 nova_compute[187212]: 2025-11-25 19:22:44.432 187216 DEBUG nova.network.neutron [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:22:44 compute-0 nova_compute[187212]: 2025-11-25 19:22:44.640 187216 WARNING neutronclient.v2_0.client [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:44 compute-0 nova_compute[187212]: 2025-11-25 19:22:44.801 187216 DEBUG nova.network.neutron [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Updating instance_info_cache with network_info: [{"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:22:45 compute-0 podman[214869]: 2025-11-25 19:22:45.242366359 +0000 UTC m=+0.157580138 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.311 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Releasing lock "refresh_cache-1140b061-ca3b-44fb-9523-49b86ac5c5e8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.312 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Instance network_info: |[{"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.314 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Start _get_guest_xml network_info=[{"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.320 187216 WARNING nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.322 187216 DEBUG nova.virt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-503570234', uuid='1140b061-ca3b-44fb-9523-49b86ac5c5e8'), owner=OwnerMeta(userid='b86907256ac0401183dd8a2c5394fe31', username='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin', projectid='01a0280ccebb48a888956426fb3d2015', projectname='tempest-TestExecuteHostMaintenanceStrategy-1349736763'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": 
"f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098565.3221765) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.328 187216 DEBUG nova.virt.libvirt.host [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.329 187216 DEBUG nova.virt.libvirt.host [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.334 187216 DEBUG nova.virt.libvirt.host [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.335 187216 DEBUG nova.virt.libvirt.host [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.337 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.337 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.338 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.338 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.338 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.339 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.339 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.339 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.340 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.340 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.340 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.341 187216 DEBUG nova.virt.hardware [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.348 187216 DEBUG nova.virt.libvirt.vif [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:22:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-503570234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-503570234',id=15,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-40hf05ys',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-Tes
tExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:22:40Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=1140b061-ca3b-44fb-9523-49b86ac5c5e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.349 187216 DEBUG nova.network.os_vif_util [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.350 187216 DEBUG nova.network.os_vif_util [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.351 187216 DEBUG nova.objects.instance [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1140b061-ca3b-44fb-9523-49b86ac5c5e8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.860 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <uuid>1140b061-ca3b-44fb-9523-49b86ac5c5e8</uuid>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <name>instance-0000000f</name>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-503570234</nova:name>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:22:45</nova:creationTime>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:22:45 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:22:45 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:user uuid="b86907256ac0401183dd8a2c5394fe31">tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin</nova:user>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:project uuid="01a0280ccebb48a888956426fb3d2015">tempest-TestExecuteHostMaintenanceStrategy-1349736763</nova:project>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         <nova:port uuid="f8f39500-5b73-4257-af08-30ed674e5d0c">
Nov 25 19:22:45 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <system>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <entry name="serial">1140b061-ca3b-44fb-9523-49b86ac5c5e8</entry>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <entry name="uuid">1140b061-ca3b-44fb-9523-49b86ac5c5e8</entry>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </system>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <os>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </os>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <features>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </features>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.config"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:19:b1:fe"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <target dev="tapf8f39500-5b"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/console.log" append="off"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <video>
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </video>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:22:45 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:22:45 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:22:45 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:22:45 compute-0 nova_compute[187212]: </domain>
Nov 25 19:22:45 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.863 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Preparing to wait for external event network-vif-plugged-f8f39500-5b73-4257-af08-30ed674e5d0c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.864 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.864 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.864 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.866 187216 DEBUG nova.virt.libvirt.vif [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:22:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-503570234',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-503570234',id=15,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-40hf05ys',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='t
empest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:22:40Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=1140b061-ca3b-44fb-9523-49b86ac5c5e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.866 187216 DEBUG nova.network.os_vif_util [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.867 187216 DEBUG nova.network.os_vif_util [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.868 187216 DEBUG os_vif [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.869 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.870 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.870 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.872 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.872 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '03653be7-5100-5f86-909e-1d4a392ace84', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.877 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.881 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.881 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8f39500-5b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.882 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf8f39500-5b, col_values=(('qos', UUID('dd90d0ca-be6e-406e-bc90-f2993ea2f75a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.883 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf8f39500-5b, col_values=(('external_ids', {'iface-id': 'f8f39500-5b73-4257-af08-30ed674e5d0c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:b1:fe', 'vm-uuid': '1140b061-ca3b-44fb-9523-49b86ac5c5e8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.884 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 NetworkManager[55552]: <info>  [1764098565.8863] manager: (tapf8f39500-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.889 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.895 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:45 compute-0 nova_compute[187212]: 2025-11-25 19:22:45.896 187216 INFO os_vif [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b')
Nov 25 19:22:47 compute-0 nova_compute[187212]: 2025-11-25 19:22:47.451 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:22:47 compute-0 nova_compute[187212]: 2025-11-25 19:22:47.451 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:22:47 compute-0 nova_compute[187212]: 2025-11-25 19:22:47.452 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] No VIF found with MAC fa:16:3e:19:b1:fe, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:22:47 compute-0 nova_compute[187212]: 2025-11-25 19:22:47.452 187216 INFO nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Using config drive
Nov 25 19:22:47 compute-0 nova_compute[187212]: 2025-11-25 19:22:47.966 187216 WARNING neutronclient.v2_0.client [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.017 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:48 compute-0 podman[214898]: 2025-11-25 19:22:48.463712642 +0000 UTC m=+0.083040468 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.482 187216 INFO nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Creating config drive at /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.config
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.492 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp8_i5zdxe execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.634 187216 DEBUG oslo_concurrency.processutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp8_i5zdxe" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:22:48 compute-0 kernel: tapf8f39500-5b: entered promiscuous mode
Nov 25 19:22:48 compute-0 NetworkManager[55552]: <info>  [1764098568.7354] manager: (tapf8f39500-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 25 19:22:48 compute-0 ovn_controller[95465]: 2025-11-25T19:22:48Z|00132|binding|INFO|Claiming lport f8f39500-5b73-4257-af08-30ed674e5d0c for this chassis.
Nov 25 19:22:48 compute-0 ovn_controller[95465]: 2025-11-25T19:22:48Z|00133|binding|INFO|f8f39500-5b73-4257-af08-30ed674e5d0c: Claiming fa:16:3e:19:b1:fe 10.100.0.6
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.738 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.744 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:b1:fe 10.100.0.6'], port_security=['fa:16:3e:19:b1:fe 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1140b061-ca3b-44fb-9523-49b86ac5c5e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=f8f39500-5b73-4257-af08-30ed674e5d0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.745 104356 INFO neutron.agent.ovn.metadata.agent [-] Port f8f39500-5b73-4257-af08-30ed674e5d0c in datapath 4c041141-ab86-4697-993b-67edbc4f2488 bound to our chassis
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.747 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.753 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:48 compute-0 ovn_controller[95465]: 2025-11-25T19:22:48Z|00134|binding|INFO|Setting lport f8f39500-5b73-4257-af08-30ed674e5d0c ovn-installed in OVS
Nov 25 19:22:48 compute-0 ovn_controller[95465]: 2025-11-25T19:22:48Z|00135|binding|INFO|Setting lport f8f39500-5b73-4257-af08-30ed674e5d0c up in Southbound
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.757 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.759 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.770 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[52830d71-3fb8-4c04-82b9-62768328e091]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.771 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4c041141-a1 in ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.775 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4c041141-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.776 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8df833-570c-45d1-a9ce-0ce7f1a2fc53]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.777 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[20399961-9d3d-4895-970c-3d54bc5e31d9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.795 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[33a1afa9-33e4-49e7-809c-884aeea7d9c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 systemd-machined[153494]: New machine qemu-12-instance-0000000f.
Nov 25 19:22:48 compute-0 systemd-udevd[214938]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:22:48 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000f.
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.816 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab1ce93-014c-4801-a655-3aba201a4e2c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 NetworkManager[55552]: <info>  [1764098568.8228] device (tapf8f39500-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:22:48 compute-0 NetworkManager[55552]: <info>  [1764098568.8247] device (tapf8f39500-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.866 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[9d27bddc-bb2b-43ff-940e-d7454d5da163]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.873 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9421486f-323a-4904-98aa-0d8f025513f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 NetworkManager[55552]: <info>  [1764098568.8755] manager: (tap4c041141-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.925 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[a69eba38-60ec-467d-b6a4-d73c4bc1690a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.931 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f2f4bf-d041-42e2-983d-01e659703168]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 NetworkManager[55552]: <info>  [1764098568.9680] device (tap4c041141-a0): carrier: link connected
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.978 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[80c44154-31d9-4f20-b5a0-eeb5439eefa0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:48.994 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.996 187216 DEBUG nova.compute.manager [req-c8b1835e-a317-4d09-9e62-9a348fc3c3b0 req-07e02ba7-c5e0-496d-ad76-b8aee6ad93aa 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-plugged-f8f39500-5b73-4257-af08-30ed674e5d0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.997 187216 DEBUG oslo_concurrency.lockutils [req-c8b1835e-a317-4d09-9e62-9a348fc3c3b0 req-07e02ba7-c5e0-496d-ad76-b8aee6ad93aa 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.998 187216 DEBUG oslo_concurrency.lockutils [req-c8b1835e-a317-4d09-9e62-9a348fc3c3b0 req-07e02ba7-c5e0-496d-ad76-b8aee6ad93aa 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.998 187216 DEBUG oslo_concurrency.lockutils [req-c8b1835e-a317-4d09-9e62-9a348fc3c3b0 req-07e02ba7-c5e0-496d-ad76-b8aee6ad93aa 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:48 compute-0 nova_compute[187212]: 2025-11-25 19:22:48.999 187216 DEBUG nova.compute.manager [req-c8b1835e-a317-4d09-9e62-9a348fc3c3b0 req-07e02ba7-c5e0-496d-ad76-b8aee6ad93aa 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Processing event network-vif-plugged-f8f39500-5b73-4257-af08-30ed674e5d0c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.004 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[fc61dff6-108b-4c5b-9e11-36938c72e0ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451663, 'reachable_time': 43987, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214969, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.009 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.028 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8425a22f-6966-4aed-8b09-8c47785f2150]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe98:2399'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451663, 'tstamp': 451663}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214970, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.053 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b64fd6df-9757-40b0-a33b-f14be2086d44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451663, 'reachable_time': 43987, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214971, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.106 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[899d7096-c82a-4bf8-ac16-a196aa7a9aee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.198 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc5054a-a1c9-44a6-9649-034c57671c90]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.200 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.200 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.201 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c041141-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.203 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:49 compute-0 kernel: tap4c041141-a0: entered promiscuous mode
Nov 25 19:22:49 compute-0 NetworkManager[55552]: <info>  [1764098569.2043] manager: (tap4c041141-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.207 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c041141-a0, col_values=(('external_ids', {'iface-id': '9941ceeb-16f5-4a0e-8227-c1de720c5499'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:49 compute-0 ovn_controller[95465]: 2025-11-25T19:22:49Z|00136|binding|INFO|Releasing lport 9941ceeb-16f5-4a0e-8227-c1de720c5499 from this chassis (sb_readonly=0)
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.208 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.232 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.234 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9e641ef5-3ee8-4d3b-b0a6-25934d69af26]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.235 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.236 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.236 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 4c041141-ab86-4697-993b-67edbc4f2488 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.236 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.237 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4465be-58ba-4fa8-a8b3-28499fe42784]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.237 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.238 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[574dcf3b-d654-4693-9516-4ef1ab0ad8ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.238 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.239 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'env', 'PROCESS_TAG=haproxy-4c041141-ab86-4697-993b-67edbc4f2488', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4c041141-ab86-4697-993b-67edbc4f2488.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.343 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.349 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.353 187216 INFO nova.virt.libvirt.driver [-] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Instance spawned successfully.
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.354 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:22:49 compute-0 podman[215010]: 2025-11-25 19:22:49.696390068 +0000 UTC m=+0.072710454 container create 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Nov 25 19:22:49 compute-0 systemd[1]: Started libpod-conmon-61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038.scope.
Nov 25 19:22:49 compute-0 podman[215010]: 2025-11-25 19:22:49.662563024 +0000 UTC m=+0.038883440 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:22:49 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:22:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/775fa6738831b640fe63e0a05cde174aa588da00dccd9a4af726199252410356/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:22:49 compute-0 podman[215010]: 2025-11-25 19:22:49.80796303 +0000 UTC m=+0.184283436 container init 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:22:49 compute-0 podman[215010]: 2025-11-25 19:22:49.817685557 +0000 UTC m=+0.194005943 container start 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=watcher_latest)
Nov 25 19:22:49 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [NOTICE]   (215029) : New worker (215031) forked
Nov 25 19:22:49 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [NOTICE]   (215029) : Loading success.
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.869 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.870 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.870 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.870 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.871 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:22:49 compute-0 nova_compute[187212]: 2025-11-25 19:22:49.871 187216 DEBUG nova.virt.libvirt.driver [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:22:49 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:49.920 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:22:50 compute-0 nova_compute[187212]: 2025-11-25 19:22:50.383 187216 INFO nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Took 9.26 seconds to spawn the instance on the hypervisor.
Nov 25 19:22:50 compute-0 nova_compute[187212]: 2025-11-25 19:22:50.385 187216 DEBUG nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:22:50 compute-0 nova_compute[187212]: 2025-11-25 19:22:50.886 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:50 compute-0 nova_compute[187212]: 2025-11-25 19:22:50.927 187216 INFO nova.compute.manager [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Took 14.50 seconds to build instance.
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.083 187216 DEBUG nova.compute.manager [req-bfee29fe-7809-4360-b230-cbf399cb32f6 req-1b6f0ca1-b688-420c-948d-3f254d90a51c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-plugged-f8f39500-5b73-4257-af08-30ed674e5d0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.083 187216 DEBUG oslo_concurrency.lockutils [req-bfee29fe-7809-4360-b230-cbf399cb32f6 req-1b6f0ca1-b688-420c-948d-3f254d90a51c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.084 187216 DEBUG oslo_concurrency.lockutils [req-bfee29fe-7809-4360-b230-cbf399cb32f6 req-1b6f0ca1-b688-420c-948d-3f254d90a51c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.084 187216 DEBUG oslo_concurrency.lockutils [req-bfee29fe-7809-4360-b230-cbf399cb32f6 req-1b6f0ca1-b688-420c-948d-3f254d90a51c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.084 187216 DEBUG nova.compute.manager [req-bfee29fe-7809-4360-b230-cbf399cb32f6 req-1b6f0ca1-b688-420c-948d-3f254d90a51c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] No waiting events found dispatching network-vif-plugged-f8f39500-5b73-4257-af08-30ed674e5d0c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.085 187216 WARNING nova.compute.manager [req-bfee29fe-7809-4360-b230-cbf399cb32f6 req-1b6f0ca1-b688-420c-948d-3f254d90a51c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received unexpected event network-vif-plugged-f8f39500-5b73-4257-af08-30ed674e5d0c for instance with vm_state active and task_state None.
Nov 25 19:22:51 compute-0 nova_compute[187212]: 2025-11-25 19:22:51.435 187216 DEBUG oslo_concurrency.lockutils [None req-ddaf463a-bcd9-43a9-a47c-dffee53f1141 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.024s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:22:53 compute-0 nova_compute[187212]: 2025-11-25 19:22:53.020 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:54 compute-0 podman[215041]: 2025-11-25 19:22:54.175077431 +0000 UTC m=+0.094907922 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 25 19:22:55 compute-0 nova_compute[187212]: 2025-11-25 19:22:55.890 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:57 compute-0 podman[215063]: 2025-11-25 19:22:57.193925736 +0000 UTC m=+0.114204742 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:22:58 compute-0 nova_compute[187212]: 2025-11-25 19:22:58.022 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:22:58 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:22:58.921 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:22:59 compute-0 podman[197585]: time="2025-11-25T19:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:22:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:22:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Nov 25 19:23:00 compute-0 nova_compute[187212]: 2025-11-25 19:23:00.922 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:01 compute-0 openstack_network_exporter[199731]: ERROR   19:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:23:01 compute-0 openstack_network_exporter[199731]: ERROR   19:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:23:01 compute-0 openstack_network_exporter[199731]: ERROR   19:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:23:01 compute-0 openstack_network_exporter[199731]: ERROR   19:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:23:01 compute-0 openstack_network_exporter[199731]: ERROR   19:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:23:02 compute-0 ovn_controller[95465]: 2025-11-25T19:23:02Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:b1:fe 10.100.0.6
Nov 25 19:23:02 compute-0 ovn_controller[95465]: 2025-11-25T19:23:02Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:b1:fe 10.100.0.6
Nov 25 19:23:03 compute-0 nova_compute[187212]: 2025-11-25 19:23:03.025 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:05 compute-0 nova_compute[187212]: 2025-11-25 19:23:05.096 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Creating tmpfile /var/lib/nova/instances/tmpu0du6uf1 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Nov 25 19:23:05 compute-0 nova_compute[187212]: 2025-11-25 19:23:05.099 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:05 compute-0 nova_compute[187212]: 2025-11-25 19:23:05.111 187216 DEBUG nova.compute.manager [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu0du6uf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Nov 25 19:23:05 compute-0 nova_compute[187212]: 2025-11-25 19:23:05.925 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:07 compute-0 nova_compute[187212]: 2025-11-25 19:23:07.162 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:08 compute-0 nova_compute[187212]: 2025-11-25 19:23:08.028 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:09 compute-0 podman[215095]: 2025-11-25 19:23:09.172447696 +0000 UTC m=+0.087369942 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:23:10 compute-0 nova_compute[187212]: 2025-11-25 19:23:10.951 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:11 compute-0 nova_compute[187212]: 2025-11-25 19:23:11.542 187216 DEBUG nova.compute.manager [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu0du6uf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b0b1d81-9beb-4f93-9171-1f2f5905362d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Nov 25 19:23:12 compute-0 nova_compute[187212]: 2025-11-25 19:23:12.558 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-8b0b1d81-9beb-4f93-9171-1f2f5905362d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:23:12 compute-0 nova_compute[187212]: 2025-11-25 19:23:12.559 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-8b0b1d81-9beb-4f93-9171-1f2f5905362d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:23:12 compute-0 nova_compute[187212]: 2025-11-25 19:23:12.559 187216 DEBUG nova.network.neutron [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:23:13 compute-0 nova_compute[187212]: 2025-11-25 19:23:13.031 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:13 compute-0 nova_compute[187212]: 2025-11-25 19:23:13.068 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:13 compute-0 nova_compute[187212]: 2025-11-25 19:23:13.988 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:14 compute-0 nova_compute[187212]: 2025-11-25 19:23:14.477 187216 DEBUG nova.network.neutron [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Updating instance_info_cache with network_info: [{"id": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "address": "fa:16:3e:52:f6:d8", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4c50bf8-30", "ovs_interfaceid": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:23:14 compute-0 nova_compute[187212]: 2025-11-25 19:23:14.985 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-8b0b1d81-9beb-4f93-9171-1f2f5905362d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.009 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu0du6uf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b0b1d81-9beb-4f93-9171-1f2f5905362d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.010 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Creating instance directory: /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.011 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Creating disk.info with the contents: {'/var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk': 'qcow2', '/var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.012 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.013 187216 DEBUG nova.objects.instance [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8b0b1d81-9beb-4f93-9171-1f2f5905362d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.521 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.529 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.531 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.626 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.627 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.628 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.629 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.636 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.637 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.723 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.725 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.767 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.768 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.768 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.853 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.855 187216 DEBUG nova.virt.disk.api [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Checking if we can resize image /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.856 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.917 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.918 187216 DEBUG nova.virt.disk.api [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Cannot resize image /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.919 187216 DEBUG nova.objects.instance [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'migration_context' on Instance uuid 8b0b1d81-9beb-4f93-9171-1f2f5905362d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:23:15 compute-0 nova_compute[187212]: 2025-11-25 19:23:15.955 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 podman[215136]: 2025-11-25 19:23:16.253350223 +0000 UTC m=+0.168102927 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.429 187216 DEBUG nova.objects.base [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Object Instance<8b0b1d81-9beb-4f93-9171-1f2f5905362d> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.430 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.456 187216 DEBUG oslo_concurrency.processutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk.config 497664" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.457 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.458 187216 DEBUG nova.virt.libvirt.vif [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-11-25T19:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-262510598',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-262510598',id=14,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:22:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-l00a87a7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:22:31Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=8b0b1d81-9beb-4f93-9171-1f2f5905362d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "address": "fa:16:3e:52:f6:d8", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd4c50bf8-30", "ovs_interfaceid": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.458 187216 DEBUG nova.network.os_vif_util [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "address": "fa:16:3e:52:f6:d8", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd4c50bf8-30", "ovs_interfaceid": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.459 187216 DEBUG nova.network.os_vif_util [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:f6:d8,bridge_name='br-int',has_traffic_filtering=True,id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4c50bf8-30') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.460 187216 DEBUG os_vif [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:f6:d8,bridge_name='br-int',has_traffic_filtering=True,id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4c50bf8-30') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.460 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.461 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.461 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.462 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.462 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'dab5f6cd-2140-5fd8-9751-ecfb4d450da3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.464 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.466 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.469 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.469 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4c50bf8-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.470 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd4c50bf8-30, col_values=(('qos', UUID('21caa074-baeb-43d4-820e-2c82d400811f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.470 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd4c50bf8-30, col_values=(('external_ids', {'iface-id': 'd4c50bf8-30cb-4dec-8579-4ceb3d6af4c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:f6:d8', 'vm-uuid': '8b0b1d81-9beb-4f93-9171-1f2f5905362d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.471 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 NetworkManager[55552]: <info>  [1764098596.4726] manager: (tapd4c50bf8-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.474 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.481 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.482 187216 INFO os_vif [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:f6:d8,bridge_name='br-int',has_traffic_filtering=True,id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4c50bf8-30')
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.483 187216 DEBUG nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.483 187216 DEBUG nova.compute.manager [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu0du6uf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b0b1d81-9beb-4f93-9171-1f2f5905362d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.484 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:16 compute-0 nova_compute[187212]: 2025-11-25 19:23:16.646 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:17 compute-0 nova_compute[187212]: 2025-11-25 19:23:17.602 187216 DEBUG nova.network.neutron [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Port d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Nov 25 19:23:17 compute-0 nova_compute[187212]: 2025-11-25 19:23:17.617 187216 DEBUG nova.compute.manager [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu0du6uf1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b0b1d81-9beb-4f93-9171-1f2f5905362d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Nov 25 19:23:18 compute-0 nova_compute[187212]: 2025-11-25 19:23:18.033 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:18 compute-0 ovn_controller[95465]: 2025-11-25T19:23:18Z|00137|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 19:23:19 compute-0 podman[215167]: 2025-11-25 19:23:19.16977124 +0000 UTC m=+0.088708008 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 25 19:23:20 compute-0 kernel: tapd4c50bf8-30: entered promiscuous mode
Nov 25 19:23:20 compute-0 NetworkManager[55552]: <info>  [1764098600.7883] manager: (tapd4c50bf8-30): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Nov 25 19:23:20 compute-0 ovn_controller[95465]: 2025-11-25T19:23:20Z|00138|binding|INFO|Claiming lport d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 for this additional chassis.
Nov 25 19:23:20 compute-0 ovn_controller[95465]: 2025-11-25T19:23:20Z|00139|binding|INFO|d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4: Claiming fa:16:3e:52:f6:d8 10.100.0.8
Nov 25 19:23:20 compute-0 nova_compute[187212]: 2025-11-25 19:23:20.790 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.806 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:f6:d8 10.100.0.8'], port_security=['fa:16:3e:52:f6:d8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8b0b1d81-9beb-4f93-9171-1f2f5905362d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.808 104356 INFO neutron.agent.ovn.metadata.agent [-] Port d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 unbound from our chassis
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.811 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:23:20 compute-0 ovn_controller[95465]: 2025-11-25T19:23:20Z|00140|binding|INFO|Setting lport d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 ovn-installed in OVS
Nov 25 19:23:20 compute-0 nova_compute[187212]: 2025-11-25 19:23:20.819 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.833 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[94653dc9-87b3-4634-be41-c2d7678c46dd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:20 compute-0 systemd-machined[153494]: New machine qemu-13-instance-0000000e.
Nov 25 19:23:20 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000e.
Nov 25 19:23:20 compute-0 systemd-udevd[215206]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.875 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cd3fc6-e9e4-4fbe-af1e-1d167771c0d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.879 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[cc85afd0-6dbd-4105-a9a3-aaeb58eda8dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:20 compute-0 NetworkManager[55552]: <info>  [1764098600.8865] device (tapd4c50bf8-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:23:20 compute-0 NetworkManager[55552]: <info>  [1764098600.8882] device (tapd4c50bf8-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.921 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[50769bb1-68a4-4aae-90d4-16c8c72e9e7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.947 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ef134e5a-c2cb-406c-ba40-bef88bb52e33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451663, 'reachable_time': 43987, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215216, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.970 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f42f5ab0-524f-4a9e-9f91-6df9c639ff94]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451681, 'tstamp': 451681}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215218, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451686, 'tstamp': 451686}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215218, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.972 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:20 compute-0 nova_compute[187212]: 2025-11-25 19:23:20.975 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:20 compute-0 nova_compute[187212]: 2025-11-25 19:23:20.976 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.977 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c041141-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.977 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.978 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c041141-a0, col_values=(('external_ids', {'iface-id': '9941ceeb-16f5-4a0e-8227-c1de720c5499'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.979 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:23:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:20.980 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a1589c2c-c05a-46e3-9dc1-7eb94c6ac9e9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-4c041141-ab86-4697-993b-67edbc4f2488\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 4c041141-ab86-4697-993b-67edbc4f2488\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:21 compute-0 nova_compute[187212]: 2025-11-25 19:23:21.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:21 compute-0 nova_compute[187212]: 2025-11-25 19:23:21.175 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:23:21 compute-0 nova_compute[187212]: 2025-11-25 19:23:21.472 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:23 compute-0 nova_compute[187212]: 2025-11-25 19:23:23.036 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:23 compute-0 ovn_controller[95465]: 2025-11-25T19:23:23Z|00141|binding|INFO|Claiming lport d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 for this chassis.
Nov 25 19:23:23 compute-0 ovn_controller[95465]: 2025-11-25T19:23:23Z|00142|binding|INFO|d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4: Claiming fa:16:3e:52:f6:d8 10.100.0.8
Nov 25 19:23:23 compute-0 ovn_controller[95465]: 2025-11-25T19:23:23Z|00143|binding|INFO|Setting lport d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 up in Southbound
Nov 25 19:23:25 compute-0 podman[215238]: 2025-11-25 19:23:25.198607415 +0000 UTC m=+0.112753773 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 19:23:25 compute-0 nova_compute[187212]: 2025-11-25 19:23:25.748 187216 INFO nova.compute.manager [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Post operation of migration started
Nov 25 19:23:25 compute-0 nova_compute[187212]: 2025-11-25 19:23:25.749 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.194 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.194 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.267 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-8b0b1d81-9beb-4f93-9171-1f2f5905362d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.267 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-8b0b1d81-9beb-4f93-9171-1f2f5905362d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.267 187216 DEBUG nova.network.neutron [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.517 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:26 compute-0 nova_compute[187212]: 2025-11-25 19:23:26.774 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:27 compute-0 sshd-session[215260]: Connection closed by 36.50.54.94 port 41538
Nov 25 19:23:27 compute-0 nova_compute[187212]: 2025-11-25 19:23:27.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:27 compute-0 nova_compute[187212]: 2025-11-25 19:23:27.788 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.072 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:28 compute-0 podman[215261]: 2025-11-25 19:23:28.210744753 +0000 UTC m=+0.097414967 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125)
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.540 187216 DEBUG nova.network.neutron [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Updating instance_info_cache with network_info: [{"id": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "address": "fa:16:3e:52:f6:d8", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4c50bf8-30", "ovs_interfaceid": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:28 compute-0 nova_compute[187212]: 2025-11-25 19:23:28.690 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.049 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-8b0b1d81-9beb-4f93-9171-1f2f5905362d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.571 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.571 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.572 187216 DEBUG oslo_concurrency.lockutils [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.578 187216 INFO nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 25 19:23:29 compute-0 virtqemud[186888]: Domain id=13 name='instance-0000000e' uuid=8b0b1d81-9beb-4f93-9171-1f2f5905362d is tainted: custom-monitor
Nov 25 19:23:29 compute-0 podman[197585]: time="2025-11-25T19:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:23:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.752 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3084 "" "Go-http-client/1.1"
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.836 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.922 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:29 compute-0 nova_compute[187212]: 2025-11-25 19:23:29.931 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.024 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.025 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.130 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.388 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.391 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.427 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.429 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5505MB free_disk=72.934814453125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.429 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.430 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:30 compute-0 nova_compute[187212]: 2025-11-25 19:23:30.588 187216 INFO nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 25 19:23:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:31.101 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:31.102 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:31.103 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: ERROR   19:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: ERROR   19:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: ERROR   19:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: ERROR   19:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: ERROR   19:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:23:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:23:31 compute-0 nova_compute[187212]: 2025-11-25 19:23:31.453 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Migration for instance 8b0b1d81-9beb-4f93-9171-1f2f5905362d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Nov 25 19:23:31 compute-0 nova_compute[187212]: 2025-11-25 19:23:31.520 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:31 compute-0 nova_compute[187212]: 2025-11-25 19:23:31.594 187216 INFO nova.virt.libvirt.driver [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 25 19:23:31 compute-0 nova_compute[187212]: 2025-11-25 19:23:31.600 187216 DEBUG nova.compute.manager [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:23:31 compute-0 nova_compute[187212]: 2025-11-25 19:23:31.963 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Updating resource usage from migration e68878dc-813e-44b3-bdc7-54b41117b5df
Nov 25 19:23:31 compute-0 nova_compute[187212]: 2025-11-25 19:23:31.964 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Starting to track incoming migration e68878dc-813e-44b3-bdc7-54b41117b5df with flavor d7d5bae9-10ca-4750-9d69-ce73a869da56 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Nov 25 19:23:32 compute-0 nova_compute[187212]: 2025-11-25 19:23:32.112 187216 DEBUG nova.objects.instance [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Nov 25 19:23:32 compute-0 nova_compute[187212]: 2025-11-25 19:23:32.507 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 1140b061-ca3b-44fb-9523-49b86ac5c5e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.017 187216 WARNING nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 8b0b1d81-9beb-4f93-9171-1f2f5905362d has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.018 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.018 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:23:30 up  1:15,  0 user,  load average: 0.37, 0.40, 0.45\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_01a0280ccebb48a888956426fb3d2015': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.075 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.085 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.137 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.399 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.400 187216 WARNING neutronclient.v2_0.client [None req-4e136dee-c44c-4506-917a-3e34f1f31ca2 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:33 compute-0 nova_compute[187212]: 2025-11-25 19:23:33.593 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:23:34 compute-0 nova_compute[187212]: 2025-11-25 19:23:34.106 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:23:34 compute-0 nova_compute[187212]: 2025-11-25 19:23:34.106 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.676s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:35 compute-0 nova_compute[187212]: 2025-11-25 19:23:35.982 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:35 compute-0 nova_compute[187212]: 2025-11-25 19:23:35.983 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:35 compute-0 nova_compute[187212]: 2025-11-25 19:23:35.983 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:35 compute-0 nova_compute[187212]: 2025-11-25 19:23:35.984 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:35 compute-0 nova_compute[187212]: 2025-11-25 19:23:35.984 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.003 187216 INFO nova.compute.manager [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Terminating instance
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.107 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.108 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.523 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.525 187216 DEBUG nova.compute.manager [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:23:36 compute-0 kernel: tapf8f39500-5b (unregistering): left promiscuous mode
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.623 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.624 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.624 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:23:36 compute-0 NetworkManager[55552]: <info>  [1764098616.6274] device (tapf8f39500-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.639 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:36 compute-0 ovn_controller[95465]: 2025-11-25T19:23:36Z|00144|binding|INFO|Releasing lport f8f39500-5b73-4257-af08-30ed674e5d0c from this chassis (sb_readonly=0)
Nov 25 19:23:36 compute-0 ovn_controller[95465]: 2025-11-25T19:23:36Z|00145|binding|INFO|Setting lport f8f39500-5b73-4257-af08-30ed674e5d0c down in Southbound
Nov 25 19:23:36 compute-0 ovn_controller[95465]: 2025-11-25T19:23:36Z|00146|binding|INFO|Removing iface tapf8f39500-5b ovn-installed in OVS
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.643 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.657 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.663 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:b1:fe 10.100.0.6'], port_security=['fa:16:3e:19:b1:fe 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1140b061-ca3b-44fb-9523-49b86ac5c5e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=f8f39500-5b73-4257-af08-30ed674e5d0c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.664 104356 INFO neutron.agent.ovn.metadata.agent [-] Port f8f39500-5b73-4257-af08-30ed674e5d0c in datapath 4c041141-ab86-4697-993b-67edbc4f2488 unbound from our chassis
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.667 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c041141-ab86-4697-993b-67edbc4f2488
Nov 25 19:23:36 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.695 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0db0ac9a-47c2-4893-982b-06bbca3b48db]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000f.scope: Consumed 14.666s CPU time.
Nov 25 19:23:36 compute-0 systemd-machined[153494]: Machine qemu-12-instance-0000000f terminated.
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.736 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[0c06bb51-f5c7-4b45-b5d7-f57c83e43e2d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.740 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e48383-3be3-4b2e-bafd-7a49c0d21da5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.783 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[0d849313-b9be-4d07-b475-f17e6f882207]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.811 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4e17435f-164b-466d-8da0-e76305f4db80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c041141-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:98:23:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451663, 'reachable_time': 43987, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215325, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.812 187216 INFO nova.virt.libvirt.driver [-] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Instance destroyed successfully.
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.813 187216 DEBUG nova.objects.instance [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lazy-loading 'resources' on Instance uuid 1140b061-ca3b-44fb-9523-49b86ac5c5e8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.836 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[95aeda94-d3e1-44ba-a5d8-811bc0e9c429]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451681, 'tstamp': 451681}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215328, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4c041141-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451686, 'tstamp': 451686}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215328, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.838 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.840 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.846 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.847 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c041141-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.847 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.847 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c041141-a0, col_values=(('external_ids', {'iface-id': '9941ceeb-16f5-4a0e-8227-c1de720c5499'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.848 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:23:36 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:36.849 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9b46a9-8c05-4077-b1fe-a705f221ee80]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-4c041141-ab86-4697-993b-67edbc4f2488\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 4c041141-ab86-4697-993b-67edbc4f2488\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.897 187216 DEBUG nova.compute.manager [req-74fbec3d-aab3-4501-8277-88a7a9902ccb req-469ee926-6566-4b70-8d8e-1983c7c978f2 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-unplugged-f8f39500-5b73-4257-af08-30ed674e5d0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.897 187216 DEBUG oslo_concurrency.lockutils [req-74fbec3d-aab3-4501-8277-88a7a9902ccb req-469ee926-6566-4b70-8d8e-1983c7c978f2 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.897 187216 DEBUG oslo_concurrency.lockutils [req-74fbec3d-aab3-4501-8277-88a7a9902ccb req-469ee926-6566-4b70-8d8e-1983c7c978f2 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.898 187216 DEBUG oslo_concurrency.lockutils [req-74fbec3d-aab3-4501-8277-88a7a9902ccb req-469ee926-6566-4b70-8d8e-1983c7c978f2 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.898 187216 DEBUG nova.compute.manager [req-74fbec3d-aab3-4501-8277-88a7a9902ccb req-469ee926-6566-4b70-8d8e-1983c7c978f2 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] No waiting events found dispatching network-vif-unplugged-f8f39500-5b73-4257-af08-30ed674e5d0c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:23:36 compute-0 nova_compute[187212]: 2025-11-25 19:23:36.898 187216 DEBUG nova.compute.manager [req-74fbec3d-aab3-4501-8277-88a7a9902ccb req-469ee926-6566-4b70-8d8e-1983c7c978f2 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-unplugged-f8f39500-5b73-4257-af08-30ed674e5d0c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.321 187216 DEBUG nova.virt.libvirt.vif [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:22:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-503570234',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-503570234',id=15,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:22:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-40hf05ys',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:22:50Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=1140b061-ca3b-44fb-9523-49b86ac5c5e8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.322 187216 DEBUG nova.network.os_vif_util [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "f8f39500-5b73-4257-af08-30ed674e5d0c", "address": "fa:16:3e:19:b1:fe", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8f39500-5b", "ovs_interfaceid": "f8f39500-5b73-4257-af08-30ed674e5d0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.322 187216 DEBUG nova.network.os_vif_util [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.322 187216 DEBUG os_vif [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.324 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.324 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8f39500-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.325 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.328 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.328 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.328 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.328 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=dd90d0ca-be6e-406e-bc90-f2993ea2f75a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.331 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.333 187216 INFO os_vif [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:b1:fe,bridge_name='br-int',has_traffic_filtering=True,id=f8f39500-5b73-4257-af08-30ed674e5d0c,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8f39500-5b')
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.333 187216 INFO nova.virt.libvirt.driver [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Deleting instance files /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8_del
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.334 187216 INFO nova.virt.libvirt.driver [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Deletion of /var/lib/nova/instances/1140b061-ca3b-44fb-9523-49b86ac5c5e8_del complete
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.850 187216 INFO nova.compute.manager [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Took 1.33 seconds to destroy the instance on the hypervisor.
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.851 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.852 187216 DEBUG nova.compute.manager [-] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.852 187216 DEBUG nova.network.neutron [-] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:23:37 compute-0 nova_compute[187212]: 2025-11-25 19:23:37.853 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.078 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.535 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.973 187216 DEBUG nova.compute.manager [req-0ce95f6a-e3c3-4ecb-806d-063169365007 req-d843db62-5099-4517-86cc-3ca0eb409695 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-unplugged-f8f39500-5b73-4257-af08-30ed674e5d0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.973 187216 DEBUG oslo_concurrency.lockutils [req-0ce95f6a-e3c3-4ecb-806d-063169365007 req-d843db62-5099-4517-86cc-3ca0eb409695 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.974 187216 DEBUG oslo_concurrency.lockutils [req-0ce95f6a-e3c3-4ecb-806d-063169365007 req-d843db62-5099-4517-86cc-3ca0eb409695 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.974 187216 DEBUG oslo_concurrency.lockutils [req-0ce95f6a-e3c3-4ecb-806d-063169365007 req-d843db62-5099-4517-86cc-3ca0eb409695 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.974 187216 DEBUG nova.compute.manager [req-0ce95f6a-e3c3-4ecb-806d-063169365007 req-d843db62-5099-4517-86cc-3ca0eb409695 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] No waiting events found dispatching network-vif-unplugged-f8f39500-5b73-4257-af08-30ed674e5d0c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:23:38 compute-0 nova_compute[187212]: 2025-11-25 19:23:38.975 187216 DEBUG nova.compute.manager [req-0ce95f6a-e3c3-4ecb-806d-063169365007 req-d843db62-5099-4517-86cc-3ca0eb409695 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-unplugged-f8f39500-5b73-4257-af08-30ed674e5d0c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:23:39 compute-0 nova_compute[187212]: 2025-11-25 19:23:39.303 187216 DEBUG nova.compute.manager [req-61c99211-1a8b-4c3d-937b-c74b7985a946 req-e7282639-6965-46fe-8501-71c81c6add0c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Received event network-vif-deleted-f8f39500-5b73-4257-af08-30ed674e5d0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:23:39 compute-0 nova_compute[187212]: 2025-11-25 19:23:39.303 187216 INFO nova.compute.manager [req-61c99211-1a8b-4c3d-937b-c74b7985a946 req-e7282639-6965-46fe-8501-71c81c6add0c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Neutron deleted interface f8f39500-5b73-4257-af08-30ed674e5d0c; detaching it from the instance and deleting it from the info cache
Nov 25 19:23:39 compute-0 nova_compute[187212]: 2025-11-25 19:23:39.304 187216 DEBUG nova.network.neutron [req-61c99211-1a8b-4c3d-937b-c74b7985a946 req-e7282639-6965-46fe-8501-71c81c6add0c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:23:39 compute-0 nova_compute[187212]: 2025-11-25 19:23:39.736 187216 DEBUG nova.network.neutron [-] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:23:39 compute-0 nova_compute[187212]: 2025-11-25 19:23:39.821 187216 DEBUG nova.compute.manager [req-61c99211-1a8b-4c3d-937b-c74b7985a946 req-e7282639-6965-46fe-8501-71c81c6add0c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Detach interface failed, port_id=f8f39500-5b73-4257-af08-30ed674e5d0c, reason: Instance 1140b061-ca3b-44fb-9523-49b86ac5c5e8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:23:40 compute-0 podman[215330]: 2025-11-25 19:23:40.17179998 +0000 UTC m=+0.091001178 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:23:40 compute-0 nova_compute[187212]: 2025-11-25 19:23:40.758 187216 INFO nova.compute.manager [-] [instance: 1140b061-ca3b-44fb-9523-49b86ac5c5e8] Took 2.91 seconds to deallocate network for instance.
Nov 25 19:23:42 compute-0 nova_compute[187212]: 2025-11-25 19:23:42.330 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:42 compute-0 nova_compute[187212]: 2025-11-25 19:23:42.685 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:42 compute-0 nova_compute[187212]: 2025-11-25 19:23:42.687 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:42 compute-0 nova_compute[187212]: 2025-11-25 19:23:42.790 187216 DEBUG nova.compute.provider_tree [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:23:43 compute-0 nova_compute[187212]: 2025-11-25 19:23:43.080 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:44 compute-0 nova_compute[187212]: 2025-11-25 19:23:44.443 187216 DEBUG nova.scheduler.client.report [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:23:47 compute-0 podman[215355]: 2025-11-25 19:23:47.224963402 +0000 UTC m=+0.142371627 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4)
Nov 25 19:23:47 compute-0 nova_compute[187212]: 2025-11-25 19:23:47.332 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:48 compute-0 nova_compute[187212]: 2025-11-25 19:23:48.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:48 compute-0 nova_compute[187212]: 2025-11-25 19:23:48.151 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 5.465s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:48 compute-0 nova_compute[187212]: 2025-11-25 19:23:48.577 187216 INFO nova.scheduler.client.report [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Deleted allocations for instance 1140b061-ca3b-44fb-9523-49b86ac5c5e8
Nov 25 19:23:50 compute-0 podman[215381]: 2025-11-25 19:23:50.171775152 +0000 UTC m=+0.087866856 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:23:52 compute-0 nova_compute[187212]: 2025-11-25 19:23:52.334 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:52 compute-0 nova_compute[187212]: 2025-11-25 19:23:52.936 187216 DEBUG oslo_concurrency.lockutils [None req-82989bb7-deeb-439c-8413-120612035f3e b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "1140b061-ca3b-44fb-9523-49b86ac5c5e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 16.953s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:53 compute-0 nova_compute[187212]: 2025-11-25 19:23:53.110 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:54 compute-0 nova_compute[187212]: 2025-11-25 19:23:54.939 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:54 compute-0 nova_compute[187212]: 2025-11-25 19:23:54.940 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:54 compute-0 nova_compute[187212]: 2025-11-25 19:23:54.940 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:54 compute-0 nova_compute[187212]: 2025-11-25 19:23:54.940 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:54 compute-0 nova_compute[187212]: 2025-11-25 19:23:54.941 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:54 compute-0 nova_compute[187212]: 2025-11-25 19:23:54.953 187216 INFO nova.compute.manager [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Terminating instance
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.473 187216 DEBUG nova.compute.manager [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:23:55 compute-0 kernel: tapd4c50bf8-30 (unregistering): left promiscuous mode
Nov 25 19:23:55 compute-0 NetworkManager[55552]: <info>  [1764098635.5014] device (tapd4c50bf8-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:23:55 compute-0 ovn_controller[95465]: 2025-11-25T19:23:55Z|00147|binding|INFO|Releasing lport d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 from this chassis (sb_readonly=0)
Nov 25 19:23:55 compute-0 ovn_controller[95465]: 2025-11-25T19:23:55Z|00148|binding|INFO|Setting lport d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 down in Southbound
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.538 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:55 compute-0 ovn_controller[95465]: 2025-11-25T19:23:55Z|00149|binding|INFO|Removing iface tapd4c50bf8-30 ovn-installed in OVS
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.541 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.548 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:f6:d8 10.100.0.8'], port_security=['fa:16:3e:52:f6:d8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8b0b1d81-9beb-4f93-9171-1f2f5905362d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c041141-ab86-4697-993b-67edbc4f2488', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '01a0280ccebb48a888956426fb3d2015', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'cd093ae9-737a-4a69-9f47-f2a7c74a9952', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa6a1d8f-d84f-49e7-84e1-a927297c44e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.549 104356 INFO neutron.agent.ovn.metadata.agent [-] Port d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 in datapath 4c041141-ab86-4697-993b-67edbc4f2488 unbound from our chassis
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.552 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c041141-ab86-4697-993b-67edbc4f2488, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.553 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[eadbca6b-6284-45c1-b638-4c5323636ff4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.554 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 namespace which is not needed anymore
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.564 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:55 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Nov 25 19:23:55 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000e.scope: Consumed 3.287s CPU time.
Nov 25 19:23:55 compute-0 systemd-machined[153494]: Machine qemu-13-instance-0000000e terminated.
Nov 25 19:23:55 compute-0 podman[215402]: 2025-11-25 19:23:55.673949358 +0000 UTC m=+0.091506191 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Nov 25 19:23:55 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [NOTICE]   (215029) : haproxy version is 3.0.5-8e879a5
Nov 25 19:23:55 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [NOTICE]   (215029) : path to executable is /usr/sbin/haproxy
Nov 25 19:23:55 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [WARNING]  (215029) : Exiting Master process...
Nov 25 19:23:55 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [ALERT]    (215029) : Current worker (215031) exited with code 143 (Terminated)
Nov 25 19:23:55 compute-0 neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488[215025]: [WARNING]  (215029) : All workers exited. Exiting... (0)
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.741 187216 DEBUG nova.compute.manager [req-69b75900-b07f-4c99-8b9b-a43176b9f90b req-1f8e55ea-44e3-43f0-af60-e3e60c14a476 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Received event network-vif-unplugged-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.742 187216 DEBUG oslo_concurrency.lockutils [req-69b75900-b07f-4c99-8b9b-a43176b9f90b req-1f8e55ea-44e3-43f0-af60-e3e60c14a476 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.742 187216 DEBUG oslo_concurrency.lockutils [req-69b75900-b07f-4c99-8b9b-a43176b9f90b req-1f8e55ea-44e3-43f0-af60-e3e60c14a476 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.743 187216 DEBUG oslo_concurrency.lockutils [req-69b75900-b07f-4c99-8b9b-a43176b9f90b req-1f8e55ea-44e3-43f0-af60-e3e60c14a476 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.743 187216 DEBUG nova.compute.manager [req-69b75900-b07f-4c99-8b9b-a43176b9f90b req-1f8e55ea-44e3-43f0-af60-e3e60c14a476 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] No waiting events found dispatching network-vif-unplugged-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:23:55 compute-0 podman[215447]: 2025-11-25 19:23:55.739634396 +0000 UTC m=+0.044960981 container kill 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:23:55 compute-0 systemd[1]: libpod-61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038.scope: Deactivated successfully.
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.744 187216 DEBUG nova.compute.manager [req-69b75900-b07f-4c99-8b9b-a43176b9f90b req-1f8e55ea-44e3-43f0-af60-e3e60c14a476 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Received event network-vif-unplugged-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.766 187216 INFO nova.virt.libvirt.driver [-] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Instance destroyed successfully.
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.767 187216 DEBUG nova.objects.instance [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lazy-loading 'resources' on Instance uuid 8b0b1d81-9beb-4f93-9171-1f2f5905362d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:23:55 compute-0 podman[215475]: 2025-11-25 19:23:55.790341537 +0000 UTC m=+0.032097120 container died 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Nov 25 19:23:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038-userdata-shm.mount: Deactivated successfully.
Nov 25 19:23:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-775fa6738831b640fe63e0a05cde174aa588da00dccd9a4af726199252410356-merged.mount: Deactivated successfully.
Nov 25 19:23:55 compute-0 podman[215475]: 2025-11-25 19:23:55.8395865 +0000 UTC m=+0.081342053 container cleanup 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Nov 25 19:23:55 compute-0 systemd[1]: libpod-conmon-61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038.scope: Deactivated successfully.
Nov 25 19:23:55 compute-0 podman[215484]: 2025-11-25 19:23:55.858599293 +0000 UTC m=+0.072389316 container remove 61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.865 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[8488748d-81d2-419a-8d23-5423d9b22172]: (4, ("Tue Nov 25 07:23:55 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 (61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038)\n61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038\nTue Nov 25 07:23:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 (61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038)\n61902b29700c49727255894edbb1ef1d76cf1c4118ba11934b2c80997c7f4038\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.867 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9e5e5e-a5f8-4507-a1a8-a2fa8cc28072]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.867 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c041141-ab86-4697-993b-67edbc4f2488.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.868 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c6be7d1d-ad54-4d06-89bc-20f10ecb2bbc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.869 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c041141-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.871 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:55 compute-0 kernel: tap4c041141-a0: left promiscuous mode
Nov 25 19:23:55 compute-0 nova_compute[187212]: 2025-11-25 19:23:55.892 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.899 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[82ffd61d-dc59-42d0-9dbd-42ff2d204d6b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.918 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[6a31b7c3-ed7a-4699-aa89-1ec666b84a2d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.919 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[49bd9707-43a8-4be9-976b-411093bf81b9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.938 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[59b22df3-3041-4e8a-b14e-50083934d810]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451652, 'reachable_time': 35338, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215511, 'error': None, 'target': 'ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.941 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4c041141-ab86-4697-993b-67edbc4f2488 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:23:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:23:55.941 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[22dd20dc-8c5a-4385-b38f-b104f91a89c6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:23:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d4c041141\x2dab86\x2d4697\x2d993b\x2d67edbc4f2488.mount: Deactivated successfully.
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.275 187216 DEBUG nova.virt.libvirt.vif [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-11-25T19:22:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-262510598',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-262510598',id=14,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:22:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='01a0280ccebb48a888956426fb3d2015',ramdisk_id='',reservation_id='r-l00a87a7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',clean_attempts='1',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1349736763-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:23:32Z,user_data=None,user_id='b86907256ac0401183dd8a2c5394fe31',uuid=8b0b1d81-9beb-4f93-9171-1f2f5905362d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "address": "fa:16:3e:52:f6:d8", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4c50bf8-30", "ovs_interfaceid": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.275 187216 DEBUG nova.network.os_vif_util [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converting VIF {"id": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "address": "fa:16:3e:52:f6:d8", "network": {"id": "4c041141-ab86-4697-993b-67edbc4f2488", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-276288590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff475c9ba4a43f881c2be6a94ae0ff9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4c50bf8-30", "ovs_interfaceid": "d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.276 187216 DEBUG nova.network.os_vif_util [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:f6:d8,bridge_name='br-int',has_traffic_filtering=True,id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4c50bf8-30') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.277 187216 DEBUG os_vif [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:f6:d8,bridge_name='br-int',has_traffic_filtering=True,id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4c50bf8-30') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.279 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.279 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4c50bf8-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.283 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.284 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.285 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=21caa074-baeb-43d4-820e-2c82d400811f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.286 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.287 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.290 187216 INFO os_vif [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:f6:d8,bridge_name='br-int',has_traffic_filtering=True,id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4,network=Network(4c041141-ab86-4697-993b-67edbc4f2488),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4c50bf8-30')
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.291 187216 INFO nova.virt.libvirt.driver [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Deleting instance files /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d_del
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.292 187216 INFO nova.virt.libvirt.driver [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Deletion of /var/lib/nova/instances/8b0b1d81-9beb-4f93-9171-1f2f5905362d_del complete
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.452 187216 INFO nova.compute.manager [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Took 0.98 seconds to destroy the instance on the hypervisor.
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.453 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.453 187216 DEBUG nova.compute.manager [-] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.453 187216 DEBUG nova.network.neutron [-] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.453 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:56 compute-0 nova_compute[187212]: 2025-11-25 19:23:56.591 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.827 187216 DEBUG nova.compute.manager [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Received event network-vif-unplugged-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.828 187216 DEBUG oslo_concurrency.lockutils [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.828 187216 DEBUG oslo_concurrency.lockutils [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.829 187216 DEBUG oslo_concurrency.lockutils [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.829 187216 DEBUG nova.compute.manager [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] No waiting events found dispatching network-vif-unplugged-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.830 187216 DEBUG nova.compute.manager [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Received event network-vif-unplugged-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.830 187216 DEBUG nova.compute.manager [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Received event network-vif-deleted-d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.831 187216 INFO nova.compute.manager [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Neutron deleted interface d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4; detaching it from the instance and deleting it from the info cache
Nov 25 19:23:57 compute-0 nova_compute[187212]: 2025-11-25 19:23:57.831 187216 DEBUG nova.network.neutron [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:23:58 compute-0 nova_compute[187212]: 2025-11-25 19:23:58.159 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:23:58 compute-0 nova_compute[187212]: 2025-11-25 19:23:58.192 187216 DEBUG nova.network.neutron [-] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:23:58 compute-0 nova_compute[187212]: 2025-11-25 19:23:58.341 187216 DEBUG nova.compute.manager [req-4b3ade82-d6a1-4335-8cf3-f326bac03940 req-d54bb3a2-0736-4980-a568-450a4ad6f2b4 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Detach interface failed, port_id=d4c50bf8-30cb-4dec-8579-4ceb3d6af4c4, reason: Instance 8b0b1d81-9beb-4f93-9171-1f2f5905362d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:23:58 compute-0 nova_compute[187212]: 2025-11-25 19:23:58.706 187216 INFO nova.compute.manager [-] [instance: 8b0b1d81-9beb-4f93-9171-1f2f5905362d] Took 2.25 seconds to deallocate network for instance.
Nov 25 19:23:59 compute-0 podman[215513]: 2025-11-25 19:23:59.152123695 +0000 UTC m=+0.079920906 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 19:23:59 compute-0 nova_compute[187212]: 2025-11-25 19:23:59.229 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:23:59 compute-0 nova_compute[187212]: 2025-11-25 19:23:59.230 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:23:59 compute-0 nova_compute[187212]: 2025-11-25 19:23:59.239 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:23:59 compute-0 nova_compute[187212]: 2025-11-25 19:23:59.281 187216 INFO nova.scheduler.client.report [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Deleted allocations for instance 8b0b1d81-9beb-4f93-9171-1f2f5905362d
Nov 25 19:23:59 compute-0 sshd-session[215533]: error: kex_exchange_identification: read: Connection reset by peer
Nov 25 19:23:59 compute-0 sshd-session[215533]: Connection reset by 45.140.17.97 port 33417
Nov 25 19:23:59 compute-0 podman[197585]: time="2025-11-25T19:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:23:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:23:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2618 "" "Go-http-client/1.1"
Nov 25 19:24:00 compute-0 nova_compute[187212]: 2025-11-25 19:24:00.309 187216 DEBUG oslo_concurrency.lockutils [None req-0b6821e5-80ed-482c-a46a-f860553a8c75 b86907256ac0401183dd8a2c5394fe31 01a0280ccebb48a888956426fb3d2015 - - default default] Lock "8b0b1d81-9beb-4f93-9171-1f2f5905362d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.369s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:24:01 compute-0 nova_compute[187212]: 2025-11-25 19:24:01.287 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: ERROR   19:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: ERROR   19:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: ERROR   19:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: ERROR   19:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: ERROR   19:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:24:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:24:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:02.906 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:24:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:02.907 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:24:02 compute-0 nova_compute[187212]: 2025-11-25 19:24:02.907 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:03 compute-0 nova_compute[187212]: 2025-11-25 19:24:03.197 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:05 compute-0 nova_compute[187212]: 2025-11-25 19:24:05.285 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:05 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:05.909 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:24:06 compute-0 nova_compute[187212]: 2025-11-25 19:24:06.289 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:08 compute-0 nova_compute[187212]: 2025-11-25 19:24:08.245 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:11 compute-0 podman[215535]: 2025-11-25 19:24:11.1566157 +0000 UTC m=+0.079712259 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:24:11 compute-0 nova_compute[187212]: 2025-11-25 19:24:11.290 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:13 compute-0 nova_compute[187212]: 2025-11-25 19:24:13.246 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:16 compute-0 nova_compute[187212]: 2025-11-25 19:24:16.293 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:16 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:16.662 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:44:de 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3c9934abb6540418711f0a3d8d13862', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e071529-0293-4440-9c70-07d9694c0383, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f3db0a73-6d5e-44f5-a754-565ad86befff) old=Port_Binding(mac=['fa:16:3e:dd:44:de'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3c9934abb6540418711f0a3d8d13862', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:24:16 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:16.664 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f3db0a73-6d5e-44f5-a754-565ad86befff in datapath 1d90bb72-93e5-4ff5-baa5-d0e187ade418 updated
Nov 25 19:24:16 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:16.665 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d90bb72-93e5-4ff5-baa5-d0e187ade418, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:24:16 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:16.666 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9eeac57a-abd6-461a-9029-49cba4d98ec2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:24:18 compute-0 nova_compute[187212]: 2025-11-25 19:24:18.250 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:18 compute-0 podman[215561]: 2025-11-25 19:24:18.258043136 +0000 UTC m=+0.177595164 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:24:21 compute-0 podman[215588]: 2025-11-25 19:24:21.150115248 +0000 UTC m=+0.076379773 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:24:21 compute-0 nova_compute[187212]: 2025-11-25 19:24:21.294 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:22 compute-0 nova_compute[187212]: 2025-11-25 19:24:22.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:22 compute-0 nova_compute[187212]: 2025-11-25 19:24:22.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:24:23 compute-0 nova_compute[187212]: 2025-11-25 19:24:23.252 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:24.122 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:4d:de 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3621c2a7-d6c2-4c72-97da-696b727c5db7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3621c2a7-d6c2-4c72-97da-696b727c5db7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0287f0353d44a63af6cafda5ee0aa0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8c0e7ea-0ad5-4a54-9b78-7a5e84343a81, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4536725e-0fb1-409f-bdd2-72d6db9e95f0) old=Port_Binding(mac=['fa:16:3e:3c:4d:de'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3621c2a7-d6c2-4c72-97da-696b727c5db7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3621c2a7-d6c2-4c72-97da-696b727c5db7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0287f0353d44a63af6cafda5ee0aa0c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:24:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:24.124 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4536725e-0fb1-409f-bdd2-72d6db9e95f0 in datapath 3621c2a7-d6c2-4c72-97da-696b727c5db7 updated
Nov 25 19:24:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:24.125 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3621c2a7-d6c2-4c72-97da-696b727c5db7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:24:24 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:24.126 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[00f8c2e2-3cbf-44b3-89b7-e36b2eaada29]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:24:26 compute-0 podman[215608]: 2025-11-25 19:24:26.133873025 +0000 UTC m=+0.062729875 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 25 19:24:26 compute-0 nova_compute[187212]: 2025-11-25 19:24:26.297 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:27 compute-0 nova_compute[187212]: 2025-11-25 19:24:27.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:28 compute-0 nova_compute[187212]: 2025-11-25 19:24:28.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:28 compute-0 nova_compute[187212]: 2025-11-25 19:24:28.281 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.694 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:24:29 compute-0 podman[197585]: time="2025-11-25T19:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:24:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:24:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2620 "" "Go-http-client/1.1"
Nov 25 19:24:29 compute-0 podman[215630]: 2025-11-25 19:24:29.858421813 +0000 UTC m=+0.107276143 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.948 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.949 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.982 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.983 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5833MB free_disk=72.99279403686523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.983 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:24:29 compute-0 nova_compute[187212]: 2025-11-25 19:24:29.984 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:24:31 compute-0 nova_compute[187212]: 2025-11-25 19:24:31.039 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:24:31 compute-0 nova_compute[187212]: 2025-11-25 19:24:31.040 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:24:29 up  1:16,  0 user,  load average: 0.31, 0.37, 0.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:24:31 compute-0 nova_compute[187212]: 2025-11-25 19:24:31.062 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:24:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:31.104 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:24:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:31.104 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:24:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:24:31.104 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:24:31 compute-0 nova_compute[187212]: 2025-11-25 19:24:31.300 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: ERROR   19:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: ERROR   19:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: ERROR   19:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: ERROR   19:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: ERROR   19:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:24:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:24:31 compute-0 nova_compute[187212]: 2025-11-25 19:24:31.571 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:24:32 compute-0 nova_compute[187212]: 2025-11-25 19:24:32.089 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:24:32 compute-0 nova_compute[187212]: 2025-11-25 19:24:32.090 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:24:33 compute-0 nova_compute[187212]: 2025-11-25 19:24:33.331 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:34 compute-0 nova_compute[187212]: 2025-11-25 19:24:34.091 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:34 compute-0 nova_compute[187212]: 2025-11-25 19:24:34.091 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:34 compute-0 nova_compute[187212]: 2025-11-25 19:24:34.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:35 compute-0 nova_compute[187212]: 2025-11-25 19:24:35.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:35 compute-0 nova_compute[187212]: 2025-11-25 19:24:35.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:24:35 compute-0 nova_compute[187212]: 2025-11-25 19:24:35.688 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:24:36 compute-0 nova_compute[187212]: 2025-11-25 19:24:36.302 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:38 compute-0 nova_compute[187212]: 2025-11-25 19:24:38.334 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:38 compute-0 nova_compute[187212]: 2025-11-25 19:24:38.688 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:38 compute-0 ovn_controller[95465]: 2025-11-25T19:24:38Z|00150|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 25 19:24:39 compute-0 nova_compute[187212]: 2025-11-25 19:24:39.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:39 compute-0 nova_compute[187212]: 2025-11-25 19:24:39.175 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:24:40 compute-0 nova_compute[187212]: 2025-11-25 19:24:40.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:24:41 compute-0 nova_compute[187212]: 2025-11-25 19:24:41.305 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:42 compute-0 podman[215652]: 2025-11-25 19:24:42.156494464 +0000 UTC m=+0.078870488 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:24:43 compute-0 nova_compute[187212]: 2025-11-25 19:24:43.338 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:46 compute-0 nova_compute[187212]: 2025-11-25 19:24:46.307 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:48 compute-0 nova_compute[187212]: 2025-11-25 19:24:48.390 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:49 compute-0 podman[215677]: 2025-11-25 19:24:49.209105639 +0000 UTC m=+0.130197973 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 19:24:51 compute-0 nova_compute[187212]: 2025-11-25 19:24:51.309 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:52 compute-0 podman[215703]: 2025-11-25 19:24:52.166151033 +0000 UTC m=+0.082781990 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Nov 25 19:24:53 compute-0 nova_compute[187212]: 2025-11-25 19:24:53.393 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:56 compute-0 nova_compute[187212]: 2025-11-25 19:24:56.311 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:57 compute-0 podman[215723]: 2025-11-25 19:24:57.17392531 +0000 UTC m=+0.090883832 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 25 19:24:58 compute-0 nova_compute[187212]: 2025-11-25 19:24:58.395 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:24:59 compute-0 podman[197585]: time="2025-11-25T19:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:24:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:24:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2620 "" "Go-http-client/1.1"
Nov 25 19:25:00 compute-0 podman[215745]: 2025-11-25 19:25:00.164496913 +0000 UTC m=+0.095861843 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 25 19:25:01 compute-0 nova_compute[187212]: 2025-11-25 19:25:01.347 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: ERROR   19:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: ERROR   19:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: ERROR   19:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: ERROR   19:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: ERROR   19:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:25:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:25:02 compute-0 nova_compute[187212]: 2025-11-25 19:25:02.473 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:02 compute-0 nova_compute[187212]: 2025-11-25 19:25:02.474 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:02 compute-0 nova_compute[187212]: 2025-11-25 19:25:02.981 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:25:03 compute-0 nova_compute[187212]: 2025-11-25 19:25:03.397 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:03 compute-0 nova_compute[187212]: 2025-11-25 19:25:03.657 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:03 compute-0 nova_compute[187212]: 2025-11-25 19:25:03.658 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:03 compute-0 nova_compute[187212]: 2025-11-25 19:25:03.682 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:25:03 compute-0 nova_compute[187212]: 2025-11-25 19:25:03.683 187216 INFO nova.compute.claims [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:25:04 compute-0 nova_compute[187212]: 2025-11-25 19:25:04.784 187216 DEBUG nova.compute.provider_tree [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:25:05 compute-0 nova_compute[187212]: 2025-11-25 19:25:05.294 187216 DEBUG nova.scheduler.client.report [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:25:05 compute-0 nova_compute[187212]: 2025-11-25 19:25:05.802 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.144s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:05 compute-0 nova_compute[187212]: 2025-11-25 19:25:05.802 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:25:06 compute-0 nova_compute[187212]: 2025-11-25 19:25:06.315 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:25:06 compute-0 nova_compute[187212]: 2025-11-25 19:25:06.315 187216 DEBUG nova.network.neutron [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:25:06 compute-0 nova_compute[187212]: 2025-11-25 19:25:06.316 187216 WARNING neutronclient.v2_0.client [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:06 compute-0 nova_compute[187212]: 2025-11-25 19:25:06.317 187216 WARNING neutronclient.v2_0.client [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:06 compute-0 nova_compute[187212]: 2025-11-25 19:25:06.390 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:06 compute-0 nova_compute[187212]: 2025-11-25 19:25:06.827 187216 INFO nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:25:07 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:07.021 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:25:07 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:07.022 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.022 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.185 187216 DEBUG nova.network.neutron [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Successfully created port: 60d64df6-789b-4ebc-bd01-a5d0912572f7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.334 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.865 187216 DEBUG nova.network.neutron [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Successfully updated port: 60d64df6-789b-4ebc-bd01-a5d0912572f7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.949 187216 DEBUG nova.compute.manager [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-changed-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.950 187216 DEBUG nova.compute.manager [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Refreshing instance network info cache due to event network-changed-60d64df6-789b-4ebc-bd01-a5d0912572f7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.950 187216 DEBUG oslo_concurrency.lockutils [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.951 187216 DEBUG oslo_concurrency.lockutils [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:25:07 compute-0 nova_compute[187212]: 2025-11-25 19:25:07.951 187216 DEBUG nova.network.neutron [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Refreshing network info cache for port 60d64df6-789b-4ebc-bd01-a5d0912572f7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:25:08 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:08.025 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.361 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.363 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.364 187216 INFO nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Creating image(s)
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.365 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.365 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.366 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.367 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.374 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.375 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.376 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.398 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.456 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.457 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.458 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.459 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.465 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.466 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.478 187216 WARNING neutronclient.v2_0.client [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.526 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.527 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.582 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.584 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.585 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.671 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.672 187216 DEBUG nova.virt.disk.api [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Checking if we can resize image /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.673 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.765 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.766 187216 DEBUG nova.virt.disk.api [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Cannot resize image /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.767 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.768 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Ensure instance console log exists: /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.768 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.769 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:08 compute-0 nova_compute[187212]: 2025-11-25 19:25:08.769 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:09 compute-0 nova_compute[187212]: 2025-11-25 19:25:09.569 187216 DEBUG nova.network.neutron [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:25:10 compute-0 nova_compute[187212]: 2025-11-25 19:25:10.696 187216 DEBUG nova.network.neutron [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:25:11 compute-0 nova_compute[187212]: 2025-11-25 19:25:11.206 187216 DEBUG oslo_concurrency.lockutils [req-1cbba672-e72b-480e-a634-c21a5dd41d08 req-b640848e-d701-4baf-863c-78d348fca7c5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:25:11 compute-0 nova_compute[187212]: 2025-11-25 19:25:11.207 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquired lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:25:11 compute-0 nova_compute[187212]: 2025-11-25 19:25:11.207 187216 DEBUG nova.network.neutron [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:25:11 compute-0 nova_compute[187212]: 2025-11-25 19:25:11.436 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:12 compute-0 nova_compute[187212]: 2025-11-25 19:25:12.573 187216 DEBUG nova.network.neutron [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:25:13 compute-0 podman[215782]: 2025-11-25 19:25:13.154550569 +0000 UTC m=+0.077154583 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:25:13 compute-0 nova_compute[187212]: 2025-11-25 19:25:13.401 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:13 compute-0 nova_compute[187212]: 2025-11-25 19:25:13.661 187216 WARNING neutronclient.v2_0.client [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.171 187216 DEBUG nova.network.neutron [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Updating instance_info_cache with network_info: [{"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.678 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Releasing lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.679 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Instance network_info: |[{"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.682 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Start _get_guest_xml network_info=[{"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.687 187216 WARNING nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.688 187216 DEBUG nova.virt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604', uuid='af5c0316-71bb-4106-9081-60ea7debb485'), owner=OwnerMeta(userid='e87bb944d08a433ca7ecc2309e015e24', username='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin', projectid='e0287f0353d44a63af6cafda5ee0aa0c', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098714.6881735) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.694 187216 DEBUG nova.virt.libvirt.host [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.694 187216 DEBUG nova.virt.libvirt.host [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.699 187216 DEBUG nova.virt.libvirt.host [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.701 187216 DEBUG nova.virt.libvirt.host [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.704 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.704 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.705 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.705 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.706 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.706 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.707 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.707 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.708 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.708 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.708 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.709 187216 DEBUG nova.virt.hardware [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.716 187216 DEBUG nova.virt.libvirt.vif [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:25:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-956',id=17,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e0287f0353d44a63af6cafda5ee0aa0c',ramdisk_id='',reservation_id='r-ltil8qq1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825',owner
_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:25:07Z,user_data=None,user_id='e87bb944d08a433ca7ecc2309e015e24',uuid=af5c0316-71bb-4106-9081-60ea7debb485,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.717 187216 DEBUG nova.network.os_vif_util [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converting VIF {"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.718 187216 DEBUG nova.network.os_vif_util [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:25:14 compute-0 nova_compute[187212]: 2025-11-25 19:25:14.720 187216 DEBUG nova.objects.instance [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lazy-loading 'pci_devices' on Instance uuid af5c0316-71bb-4106-9081-60ea7debb485 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.230 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <uuid>af5c0316-71bb-4106-9081-60ea7debb485</uuid>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <name>instance-00000011</name>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604</nova:name>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:25:14</nova:creationTime>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:25:15 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:25:15 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:user uuid="e87bb944d08a433ca7ecc2309e015e24">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin</nova:user>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:project uuid="e0287f0353d44a63af6cafda5ee0aa0c">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825</nova:project>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         <nova:port uuid="60d64df6-789b-4ebc-bd01-a5d0912572f7">
Nov 25 19:25:15 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <system>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <entry name="serial">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <entry name="uuid">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </system>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <os>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </os>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <features>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </features>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:fe:39:91"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <target dev="tap60d64df6-78"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <video>
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </video>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:25:15 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:25:15 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:25:15 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:25:15 compute-0 nova_compute[187212]: </domain>
Nov 25 19:25:15 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.232 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Preparing to wait for external event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.232 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.232 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.233 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.234 187216 DEBUG nova.virt.libvirt.vif [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:25:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-956',id=17,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e0287f0353d44a63af6cafda5ee0aa0c',ramdisk_id='',reservation_id='r-ltil8qq1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830
825',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:25:07Z,user_data=None,user_id='e87bb944d08a433ca7ecc2309e015e24',uuid=af5c0316-71bb-4106-9081-60ea7debb485,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.234 187216 DEBUG nova.network.os_vif_util [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converting VIF {"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.235 187216 DEBUG nova.network.os_vif_util [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.236 187216 DEBUG os_vif [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.237 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.237 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.238 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.239 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.239 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a276625e-c705-5b88-8896-5f816f456a07', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.241 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.243 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.248 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.249 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60d64df6-78, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.249 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap60d64df6-78, col_values=(('qos', UUID('c85b0bc9-8b6f-44fb-8589-bfa54fd1e886')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.250 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap60d64df6-78, col_values=(('external_ids', {'iface-id': '60d64df6-789b-4ebc-bd01-a5d0912572f7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:39:91', 'vm-uuid': 'af5c0316-71bb-4106-9081-60ea7debb485'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.252 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 NetworkManager[55552]: <info>  [1764098715.2536] manager: (tap60d64df6-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.254 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.260 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:15 compute-0 nova_compute[187212]: 2025-11-25 19:25:15.262 187216 INFO os_vif [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78')
Nov 25 19:25:16 compute-0 nova_compute[187212]: 2025-11-25 19:25:16.832 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:25:16 compute-0 nova_compute[187212]: 2025-11-25 19:25:16.832 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:25:16 compute-0 nova_compute[187212]: 2025-11-25 19:25:16.832 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] No VIF found with MAC fa:16:3e:fe:39:91, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:25:16 compute-0 nova_compute[187212]: 2025-11-25 19:25:16.833 187216 INFO nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Using config drive
Nov 25 19:25:17 compute-0 nova_compute[187212]: 2025-11-25 19:25:17.345 187216 WARNING neutronclient.v2_0.client [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.403 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.562 187216 INFO nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Creating config drive at /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.572 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmphh8y870r execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.715 187216 DEBUG oslo_concurrency.processutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmphh8y870r" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:18 compute-0 kernel: tap60d64df6-78: entered promiscuous mode
Nov 25 19:25:18 compute-0 NetworkManager[55552]: <info>  [1764098718.8186] manager: (tap60d64df6-78): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Nov 25 19:25:18 compute-0 ovn_controller[95465]: 2025-11-25T19:25:18Z|00151|binding|INFO|Claiming lport 60d64df6-789b-4ebc-bd01-a5d0912572f7 for this chassis.
Nov 25 19:25:18 compute-0 ovn_controller[95465]: 2025-11-25T19:25:18Z|00152|binding|INFO|60d64df6-789b-4ebc-bd01-a5d0912572f7: Claiming fa:16:3e:fe:39:91 10.100.0.5
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.859 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.871 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.884 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:39:91 10.100.0.5'], port_security=['fa:16:3e:fe:39:91 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'af5c0316-71bb-4106-9081-60ea7debb485', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0287f0353d44a63af6cafda5ee0aa0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b7ddaac2-ed9c-4646-93a4-964aad68db2c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e071529-0293-4440-9c70-07d9694c0383, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=60d64df6-789b-4ebc-bd01-a5d0912572f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.885 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 60d64df6-789b-4ebc-bd01-a5d0912572f7 in datapath 1d90bb72-93e5-4ff5-baa5-d0e187ade418 bound to our chassis
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.888 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d90bb72-93e5-4ff5-baa5-d0e187ade418
Nov 25 19:25:18 compute-0 systemd-udevd[215826]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.908 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[11333050-190a-47c8-b964-b28a5d4f53f9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.909 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d90bb72-91 in ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.912 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d90bb72-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.912 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a729c00f-77dd-4024-8075-ce9e6ad44baa]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.914 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[bf573c5d-1576-4aad-9807-20845d976433]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:18 compute-0 systemd-machined[153494]: New machine qemu-14-instance-00000011.
Nov 25 19:25:18 compute-0 NetworkManager[55552]: <info>  [1764098718.9273] device (tap60d64df6-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:25:18 compute-0 NetworkManager[55552]: <info>  [1764098718.9291] device (tap60d64df6-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.937 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[8cfb3575-1a64-495c-9dd4-1c350d0d8dee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:18 compute-0 ovn_controller[95465]: 2025-11-25T19:25:18Z|00153|binding|INFO|Setting lport 60d64df6-789b-4ebc-bd01-a5d0912572f7 ovn-installed in OVS
Nov 25 19:25:18 compute-0 ovn_controller[95465]: 2025-11-25T19:25:18Z|00154|binding|INFO|Setting lport 60d64df6-789b-4ebc-bd01-a5d0912572f7 up in Southbound
Nov 25 19:25:18 compute-0 nova_compute[187212]: 2025-11-25 19:25:18.956 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:18 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000011.
Nov 25 19:25:18 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:18.963 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[99218f9e-6d8d-488a-b35c-67650e26ff31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.010 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b30711-8d37-4bd4-9814-f1a5e9ff33d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 systemd-udevd[215830]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:25:19 compute-0 NetworkManager[55552]: <info>  [1764098719.0200] manager: (tap1d90bb72-90): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.020 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd19f89-fba0-47d0-b575-2c8f93f3112b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.076 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[d3af5fe9-23e5-4c44-ad4d-802904d84a56]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.080 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[36e4daab-abb7-4dcd-818f-23c9b6954e3e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 NetworkManager[55552]: <info>  [1764098719.1089] device (tap1d90bb72-90): carrier: link connected
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.121 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[ac900104-157a-4eb6-b9f9-76235c118e6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.144 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[170374d7-ae78-4e1a-b029-765925dad23b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d90bb72-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:44:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466677, 'reachable_time': 24636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215859, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.163 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[52f8c697-8a81-43e9-9ca2-1fbd5eb115db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:44de'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466677, 'tstamp': 466677}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215860, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.186 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ebaee15b-19d4-4ccc-9667-c05bac76d9fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d90bb72-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:44:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466677, 'reachable_time': 24636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215861, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.228 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1c72ebe7-425a-44bd-961c-93fd008daacb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.320 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1a04893f-1aee-43a1-a65b-d65aec3bdbfa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.321 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d90bb72-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.321 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.322 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d90bb72-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.323 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:19 compute-0 NetworkManager[55552]: <info>  [1764098719.3249] manager: (tap1d90bb72-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 25 19:25:19 compute-0 kernel: tap1d90bb72-90: entered promiscuous mode
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.326 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.327 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d90bb72-90, col_values=(('external_ids', {'iface-id': 'f3db0a73-6d5e-44f5-a754-565ad86befff'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:19 compute-0 ovn_controller[95465]: 2025-11-25T19:25:19Z|00155|binding|INFO|Releasing lport f3db0a73-6d5e-44f5-a754-565ad86befff from this chassis (sb_readonly=0)
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.351 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.352 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[528d0458-1745-4dd8-97e1-e8e08f29ec49]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.352 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.353 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.353 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1d90bb72-93e5-4ff5-baa5-d0e187ade418 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.354 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.354 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[814982f9-31a2-4883-8154-a6cc326d6654]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.355 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.355 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5f56ef-0ae5-4f7a-acdf-5f84cf0d7b8c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.356 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-1d90bb72-93e5-4ff5-baa5-d0e187ade418
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID 1d90bb72-93e5-4ff5-baa5-d0e187ade418
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:25:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:19.359 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'env', 'PROCESS_TAG=haproxy-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d90bb72-93e5-4ff5-baa5-d0e187ade418.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.705 187216 DEBUG nova.compute.manager [req-5b9216a9-5973-45ed-be97-5ee6514925b0 req-821eae05-d7c5-49fc-ac9b-d59a3ddaeeb5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.706 187216 DEBUG oslo_concurrency.lockutils [req-5b9216a9-5973-45ed-be97-5ee6514925b0 req-821eae05-d7c5-49fc-ac9b-d59a3ddaeeb5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.707 187216 DEBUG oslo_concurrency.lockutils [req-5b9216a9-5973-45ed-be97-5ee6514925b0 req-821eae05-d7c5-49fc-ac9b-d59a3ddaeeb5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.707 187216 DEBUG oslo_concurrency.lockutils [req-5b9216a9-5973-45ed-be97-5ee6514925b0 req-821eae05-d7c5-49fc-ac9b-d59a3ddaeeb5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.707 187216 DEBUG nova.compute.manager [req-5b9216a9-5973-45ed-be97-5ee6514925b0 req-821eae05-d7c5-49fc-ac9b-d59a3ddaeeb5 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Processing event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.807 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.814 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.818 187216 INFO nova.virt.libvirt.driver [-] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Instance spawned successfully.
Nov 25 19:25:19 compute-0 nova_compute[187212]: 2025-11-25 19:25:19.818 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:25:19 compute-0 podman[215899]: 2025-11-25 19:25:19.785578095 +0000 UTC m=+0.038599143 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:25:20 compute-0 podman[215899]: 2025-11-25 19:25:20.028768898 +0000 UTC m=+0.281789896 container create 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Nov 25 19:25:20 compute-0 systemd[1]: Started libpod-conmon-46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176.scope.
Nov 25 19:25:20 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:25:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3fe65dd2381cb7b53e12b3af346cc0e4d724c65e898ab54c7a6236c31f2c461/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.252 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:20 compute-0 podman[215899]: 2025-11-25 19:25:20.301746632 +0000 UTC m=+0.554767680 container init 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 25 19:25:20 compute-0 podman[215913]: 2025-11-25 19:25:20.314122747 +0000 UTC m=+0.235716709 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:25:20 compute-0 podman[215899]: 2025-11-25 19:25:20.314978309 +0000 UTC m=+0.567999307 container start 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.336 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.336 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.337 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.338 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.339 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.340 187216 DEBUG nova.virt.libvirt.driver [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:25:20 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [NOTICE]   (215942) : New worker (215944) forked
Nov 25 19:25:20 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [NOTICE]   (215942) : Loading success.
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.855 187216 INFO nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Took 12.49 seconds to spawn the instance on the hypervisor.
Nov 25 19:25:20 compute-0 nova_compute[187212]: 2025-11-25 19:25:20.856 187216 DEBUG nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.405 187216 INFO nova.compute.manager [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Took 17.90 seconds to build instance.
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.787 187216 DEBUG nova.compute.manager [req-d54b1948-c25c-41cf-95ef-6e8b04714174 req-39406750-360b-4ee2-885a-f74249696032 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.787 187216 DEBUG oslo_concurrency.lockutils [req-d54b1948-c25c-41cf-95ef-6e8b04714174 req-39406750-360b-4ee2-885a-f74249696032 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.787 187216 DEBUG oslo_concurrency.lockutils [req-d54b1948-c25c-41cf-95ef-6e8b04714174 req-39406750-360b-4ee2-885a-f74249696032 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.788 187216 DEBUG oslo_concurrency.lockutils [req-d54b1948-c25c-41cf-95ef-6e8b04714174 req-39406750-360b-4ee2-885a-f74249696032 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.788 187216 DEBUG nova.compute.manager [req-d54b1948-c25c-41cf-95ef-6e8b04714174 req-39406750-360b-4ee2-885a-f74249696032 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.788 187216 WARNING nova.compute.manager [req-d54b1948-c25c-41cf-95ef-6e8b04714174 req-39406750-360b-4ee2-885a-f74249696032 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received unexpected event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with vm_state active and task_state None.
Nov 25 19:25:21 compute-0 nova_compute[187212]: 2025-11-25 19:25:21.911 187216 DEBUG oslo_concurrency.lockutils [None req-8d1cd30a-6f22-4716-a844-6c6765ba915a e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.437s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:22 compute-0 nova_compute[187212]: 2025-11-25 19:25:22.682 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:22 compute-0 nova_compute[187212]: 2025-11-25 19:25:22.682 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:25:22 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 25 19:25:23 compute-0 podman[215955]: 2025-11-25 19:25:23.070700847 +0000 UTC m=+0.066774481 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Nov 25 19:25:23 compute-0 nova_compute[187212]: 2025-11-25 19:25:23.404 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:25 compute-0 nova_compute[187212]: 2025-11-25 19:25:25.255 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:28 compute-0 podman[215975]: 2025-11-25 19:25:28.186275099 +0000 UTC m=+0.099240441 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Nov 25 19:25:28 compute-0 nova_compute[187212]: 2025-11-25 19:25:28.405 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:29 compute-0 nova_compute[187212]: 2025-11-25 19:25:29.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:29 compute-0 podman[197585]: time="2025-11-25T19:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:25:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:25:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3081 "" "Go-http-client/1.1"
Nov 25 19:25:29 compute-0 nova_compute[187212]: 2025-11-25 19:25:29.842 187216 DEBUG nova.compute.manager [None req-6e9f1a44-399f-43c1-bfa5-1e6c02edce42 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Nov 25 19:25:29 compute-0 nova_compute[187212]: 2025-11-25 19:25:29.982 187216 DEBUG nova.compute.provider_tree [None req-6e9f1a44-399f-43c1-bfa5-1e6c02edce42 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Updating resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 generation from 12 to 18 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.257 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:30 compute-0 nova_compute[187212]: 2025-11-25 19:25:30.694 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:25:30 compute-0 podman[216000]: 2025-11-25 19:25:30.845823977 +0000 UTC m=+0.095424192 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:25:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:31.105 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:31.106 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:31.106 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: ERROR   19:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: ERROR   19:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: ERROR   19:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: ERROR   19:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: ERROR   19:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:25:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:25:31 compute-0 nova_compute[187212]: 2025-11-25 19:25:31.746 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:31 compute-0 nova_compute[187212]: 2025-11-25 19:25:31.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:31 compute-0 nova_compute[187212]: 2025-11-25 19:25:31.836 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:31 compute-0 nova_compute[187212]: 2025-11-25 19:25:31.903 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:32 compute-0 nova_compute[187212]: 2025-11-25 19:25:32.177 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:25:32 compute-0 nova_compute[187212]: 2025-11-25 19:25:32.180 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:32 compute-0 nova_compute[187212]: 2025-11-25 19:25:32.210 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:32 compute-0 nova_compute[187212]: 2025-11-25 19:25:32.212 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=72.99189376831055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:25:32 compute-0 nova_compute[187212]: 2025-11-25 19:25:32.212 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:32 compute-0 nova_compute[187212]: 2025-11-25 19:25:32.212 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.409 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.756 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 92eb4831-4cbb-4d68-8c0b-f9b944d28ad9 has allocations against this compute host but is not found in the database.
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.757 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.758 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:25:32 up  1:17,  0 user,  load average: 0.27, 0.33, 0.42\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.775 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.796 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.797 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.807 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.823 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STATUS_DISABLED,COMPUTE_SOUND_MODEL_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:25:33 compute-0 nova_compute[187212]: 2025-11-25 19:25:33.861 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:25:34 compute-0 nova_compute[187212]: 2025-11-25 19:25:34.367 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:25:34 compute-0 nova_compute[187212]: 2025-11-25 19:25:34.878 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:25:34 compute-0 nova_compute[187212]: 2025-11-25 19:25:34.878 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.666s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:35 compute-0 nova_compute[187212]: 2025-11-25 19:25:35.260 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:36 compute-0 ovn_controller[95465]: 2025-11-25T19:25:36Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fe:39:91 10.100.0.5
Nov 25 19:25:36 compute-0 ovn_controller[95465]: 2025-11-25T19:25:36Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fe:39:91 10.100.0.5
Nov 25 19:25:37 compute-0 nova_compute[187212]: 2025-11-25 19:25:37.874 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:37 compute-0 nova_compute[187212]: 2025-11-25 19:25:37.875 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:38 compute-0 nova_compute[187212]: 2025-11-25 19:25:38.265 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Check if temp file /var/lib/nova/instances/tmpng_rrm7c exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Nov 25 19:25:38 compute-0 nova_compute[187212]: 2025-11-25 19:25:38.270 187216 DEBUG nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpng_rrm7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af5c0316-71bb-4106-9081-60ea7debb485',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Nov 25 19:25:38 compute-0 nova_compute[187212]: 2025-11-25 19:25:38.387 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:38 compute-0 nova_compute[187212]: 2025-11-25 19:25:38.387 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:38 compute-0 nova_compute[187212]: 2025-11-25 19:25:38.463 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:39 compute-0 nova_compute[187212]: 2025-11-25 19:25:39.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:25:40 compute-0 nova_compute[187212]: 2025-11-25 19:25:40.263 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.795 187216 DEBUG oslo_concurrency.processutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.883 187216 DEBUG oslo_concurrency.processutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.885 187216 DEBUG oslo_concurrency.processutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.970 187216 DEBUG oslo_concurrency.processutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.972 187216 DEBUG nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Preparing to wait for external event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.973 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.973 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:42 compute-0 nova_compute[187212]: 2025-11-25 19:25:42.974 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:43 compute-0 nova_compute[187212]: 2025-11-25 19:25:43.466 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:44 compute-0 podman[216054]: 2025-11-25 19:25:44.188641225 +0000 UTC m=+0.091582990 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:25:45 compute-0 nova_compute[187212]: 2025-11-25 19:25:45.266 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.501 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.920 187216 DEBUG nova.compute.manager [req-7a4cdf62-74df-45be-8aab-93d26563c15a req-d1af95ec-62b2-4d70-9a81-c977f32d79ae 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.920 187216 DEBUG oslo_concurrency.lockutils [req-7a4cdf62-74df-45be-8aab-93d26563c15a req-d1af95ec-62b2-4d70-9a81-c977f32d79ae 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.921 187216 DEBUG oslo_concurrency.lockutils [req-7a4cdf62-74df-45be-8aab-93d26563c15a req-d1af95ec-62b2-4d70-9a81-c977f32d79ae 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.921 187216 DEBUG oslo_concurrency.lockutils [req-7a4cdf62-74df-45be-8aab-93d26563c15a req-d1af95ec-62b2-4d70-9a81-c977f32d79ae 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.921 187216 DEBUG nova.compute.manager [req-7a4cdf62-74df-45be-8aab-93d26563c15a req-d1af95ec-62b2-4d70-9a81-c977f32d79ae 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No event matching network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 in dict_keys([('network-vif-plugged', '60d64df6-789b-4ebc-bd01-a5d0912572f7')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.922 187216 DEBUG nova.compute.manager [req-7a4cdf62-74df-45be-8aab-93d26563c15a req-d1af95ec-62b2-4d70-9a81-c977f32d79ae 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:25:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:48.935 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:25:48 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:48.936 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:25:48 compute-0 nova_compute[187212]: 2025-11-25 19:25:48.936 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:49 compute-0 ovn_controller[95465]: 2025-11-25T19:25:49Z|00156|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.000 187216 INFO nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.268 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.996 187216 DEBUG nova.compute.manager [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.997 187216 DEBUG oslo_concurrency.lockutils [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.997 187216 DEBUG oslo_concurrency.lockutils [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.998 187216 DEBUG oslo_concurrency.lockutils [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.998 187216 DEBUG nova.compute.manager [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Processing event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.998 187216 DEBUG nova.compute.manager [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-changed-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:50 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.999 187216 DEBUG nova.compute.manager [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Refreshing instance network info cache due to event network-changed-60d64df6-789b-4ebc-bd01-a5d0912572f7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:25:51 compute-0 nova_compute[187212]: 2025-11-25 19:25:50.999 187216 DEBUG oslo_concurrency.lockutils [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:25:51 compute-0 nova_compute[187212]: 2025-11-25 19:25:51.000 187216 DEBUG oslo_concurrency.lockutils [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:25:51 compute-0 nova_compute[187212]: 2025-11-25 19:25:51.000 187216 DEBUG nova.network.neutron [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Refreshing network info cache for port 60d64df6-789b-4ebc-bd01-a5d0912572f7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:25:51 compute-0 nova_compute[187212]: 2025-11-25 19:25:51.002 187216 DEBUG nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:25:51 compute-0 podman[216081]: 2025-11-25 19:25:51.222620463 +0000 UTC m=+0.136676153 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:25:51 compute-0 nova_compute[187212]: 2025-11-25 19:25:51.512 187216 WARNING neutronclient.v2_0.client [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:51 compute-0 nova_compute[187212]: 2025-11-25 19:25:51.520 187216 DEBUG nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpng_rrm7c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af5c0316-71bb-4106-9081-60ea7debb485',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(92eb4831-4cbb-4d68-8c0b-f9b944d28ad9),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.042 187216 DEBUG nova.objects.instance [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'migration_context' on Instance uuid af5c0316-71bb-4106-9081-60ea7debb485 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.043 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.045 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.045 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.463 187216 WARNING neutronclient.v2_0.client [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.547 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.548 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.554 187216 DEBUG nova.virt.libvirt.vif [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:25:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-956',id=17,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:25:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e0287f0353d44a63af6cafda5ee0aa0c',ramdisk_id='',reservation_id='r-ltil8qq1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:25:20Z,user_data=None,user_id='e87bb944d08a433ca7ecc2309e015e24',uuid=af5c0316-71bb-4106-9081-60ea7debb485,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.554 187216 DEBUG nova.network.os_vif_util [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.555 187216 DEBUG nova.network.os_vif_util [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.556 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Updating guest XML with vif config: <interface type="ethernet">
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <mac address="fa:16:3e:fe:39:91"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <model type="virtio"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <mtu size="1442"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <target dev="tap60d64df6-78"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]: </interface>
Nov 25 19:25:52 compute-0 nova_compute[187212]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.556 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <name>instance-00000011</name>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <uuid>af5c0316-71bb-4106-9081-60ea7debb485</uuid>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604</nova:name>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:25:14</nova:creationTime>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:user uuid="e87bb944d08a433ca7ecc2309e015e24">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin</nova:user>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:project uuid="e0287f0353d44a63af6cafda5ee0aa0c">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825</nova:project>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:port uuid="60d64df6-789b-4ebc-bd01-a5d0912572f7">
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <memory unit="KiB">131072</memory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <currentMemory unit="KiB">131072</currentMemory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <vcpu placement="static">1</vcpu>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <resource>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <partition>/machine</partition>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </resource>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <system>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="serial">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="uuid">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </system>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <os>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </os>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <features>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <vmcoreinfo state="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </features>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact" check="partial">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <model fallback="allow">Nehalem</model>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_poweroff>destroy</on_poweroff>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_reboot>restart</on_reboot>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_crash>destroy</on_crash>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <readonly/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="0" model="pcie-root"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="1" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="1" port="0x10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="2" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="2" port="0x11"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="3" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="3" port="0x12"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="4" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="4" port="0x13"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="5" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="5" port="0x14"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="6" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="6" port="0x15"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="7" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="7" port="0x16"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="8" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="8" port="0x17"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="9" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="9" port="0x18"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="10" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="10" port="0x19"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="11" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="11" port="0x1a"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="12" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="12" port="0x1b"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="13" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="13" port="0x1c"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="14" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="14" port="0x1d"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="15" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="15" port="0x1e"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="16" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="16" port="0x1f"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="17" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="17" port="0x20"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="18" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="18" port="0x21"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="19" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="19" port="0x22"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="20" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="20" port="0x23"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="21" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="21" port="0x24"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="22" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="22" port="0x25"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="23" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="23" port="0x26"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="24" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="24" port="0x27"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="25" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="25" port="0x28"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-pci-bridge"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="usb" index="0" model="piix3-uhci">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="sata" index="0">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <interface type="ethernet"><mac address="fa:16:3e:fe:39:91"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap60d64df6-78"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </interface><serial type="pty">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target type="isa-serial" port="0">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <model name="isa-serial"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </target>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <console type="pty">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target type="serial" port="0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </console>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="usb" bus="0" port="1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </input>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <input type="mouse" bus="ps2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <listen type="address" address="::"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </graphics>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <video>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model type="virtio" heads="1" primary="yes"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </video>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]: </domain>
Nov 25 19:25:52 compute-0 nova_compute[187212]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.558 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <name>instance-00000011</name>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <uuid>af5c0316-71bb-4106-9081-60ea7debb485</uuid>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604</nova:name>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:25:14</nova:creationTime>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:user uuid="e87bb944d08a433ca7ecc2309e015e24">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin</nova:user>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:project uuid="e0287f0353d44a63af6cafda5ee0aa0c">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825</nova:project>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:port uuid="60d64df6-789b-4ebc-bd01-a5d0912572f7">
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <memory unit="KiB">131072</memory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <currentMemory unit="KiB">131072</currentMemory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <vcpu placement="static">1</vcpu>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <resource>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <partition>/machine</partition>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </resource>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <system>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="serial">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="uuid">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </system>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <os>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </os>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <features>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <vmcoreinfo state="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </features>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact" check="partial">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <model fallback="allow">Nehalem</model>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_poweroff>destroy</on_poweroff>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_reboot>restart</on_reboot>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_crash>destroy</on_crash>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <readonly/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="0" model="pcie-root"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="1" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="1" port="0x10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="2" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="2" port="0x11"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="3" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="3" port="0x12"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="4" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="4" port="0x13"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="5" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="5" port="0x14"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="6" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="6" port="0x15"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="7" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="7" port="0x16"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="8" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="8" port="0x17"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="9" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="9" port="0x18"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="10" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="10" port="0x19"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="11" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="11" port="0x1a"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="12" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="12" port="0x1b"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="13" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="13" port="0x1c"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="14" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="14" port="0x1d"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="15" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="15" port="0x1e"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="16" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="16" port="0x1f"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="17" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="17" port="0x20"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="18" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="18" port="0x21"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="19" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="19" port="0x22"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="20" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="20" port="0x23"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="21" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="21" port="0x24"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="22" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="22" port="0x25"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="23" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="23" port="0x26"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="24" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="24" port="0x27"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="25" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="25" port="0x28"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-pci-bridge"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="usb" index="0" model="piix3-uhci">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="sata" index="0">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <interface type="ethernet"><mac address="fa:16:3e:fe:39:91"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap60d64df6-78"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </interface><serial type="pty">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target type="isa-serial" port="0">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <model name="isa-serial"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </target>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <console type="pty">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target type="serial" port="0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </console>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="usb" bus="0" port="1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </input>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <input type="mouse" bus="ps2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <listen type="address" address="::"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </graphics>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <video>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model type="virtio" heads="1" primary="yes"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </video>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]: </domain>
Nov 25 19:25:52 compute-0 nova_compute[187212]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.558 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] _update_pci_xml output xml=<domain type="kvm">
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <name>instance-00000011</name>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <uuid>af5c0316-71bb-4106-9081-60ea7debb485</uuid>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604</nova:name>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:25:14</nova:creationTime>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:user uuid="e87bb944d08a433ca7ecc2309e015e24">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin</nova:user>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:project uuid="e0287f0353d44a63af6cafda5ee0aa0c">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825</nova:project>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <nova:port uuid="60d64df6-789b-4ebc-bd01-a5d0912572f7">
Nov 25 19:25:52 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <memory unit="KiB">131072</memory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <currentMemory unit="KiB">131072</currentMemory>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <vcpu placement="static">1</vcpu>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <resource>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <partition>/machine</partition>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </resource>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <system>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="serial">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="uuid">af5c0316-71bb-4106-9081-60ea7debb485</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </system>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <os>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </os>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <features>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <vmcoreinfo state="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </features>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact" check="partial">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <model fallback="allow">Nehalem</model>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_poweroff>destroy</on_poweroff>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_reboot>restart</on_reboot>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <on_crash>destroy</on_crash>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/disk.config"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <readonly/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="0" model="pcie-root"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="1" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="1" port="0x10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="2" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="2" port="0x11"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="3" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="3" port="0x12"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="4" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="4" port="0x13"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="5" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="5" port="0x14"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="6" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="6" port="0x15"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="7" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="7" port="0x16"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="8" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="8" port="0x17"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="9" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="9" port="0x18"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="10" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="10" port="0x19"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="11" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="11" port="0x1a"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="12" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="12" port="0x1b"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="13" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="13" port="0x1c"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="14" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="14" port="0x1d"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="15" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="15" port="0x1e"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="16" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="16" port="0x1f"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="17" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="17" port="0x20"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="18" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="18" port="0x21"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="19" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="19" port="0x22"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="20" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="20" port="0x23"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="21" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="21" port="0x24"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="22" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="22" port="0x25"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="23" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="23" port="0x26"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="24" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="24" port="0x27"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="25" model="pcie-root-port">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-root-port"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target chassis="25" port="0x28"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model name="pcie-pci-bridge"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="usb" index="0" model="piix3-uhci">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <controller type="sata" index="0">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </controller>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:fe:39:91"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target dev="tap60d64df6-78"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target type="isa-serial" port="0">
Nov 25 19:25:52 compute-0 nova_compute[187212]:         <model name="isa-serial"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       </target>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <console type="pty">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485/console.log" append="off"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <target type="serial" port="0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </console>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="usb" bus="0" port="1"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </input>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <input type="mouse" bus="ps2"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <listen type="address" address="::"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </graphics>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <video>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <model type="virtio" heads="1" primary="yes"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </video>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:25:52 compute-0 nova_compute[187212]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:25:52 compute-0 nova_compute[187212]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Nov 25 19:25:52 compute-0 nova_compute[187212]: </domain>
Nov 25 19:25:52 compute-0 nova_compute[187212]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.559 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.922 187216 DEBUG nova.network.neutron [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Updated VIF entry in instance network info cache for port 60d64df6-789b-4ebc-bd01-a5d0912572f7. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Nov 25 19:25:52 compute-0 nova_compute[187212]: 2025-11-25 19:25:52.924 187216 DEBUG nova.network.neutron [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Updating instance_info_cache with network_info: [{"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:25:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:52.937 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:53 compute-0 nova_compute[187212]: 2025-11-25 19:25:53.051 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Nov 25 19:25:53 compute-0 nova_compute[187212]: 2025-11-25 19:25:53.051 187216 INFO nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 25 19:25:53 compute-0 nova_compute[187212]: 2025-11-25 19:25:53.434 187216 DEBUG oslo_concurrency.lockutils [req-e73ea6aa-64fc-40c2-906e-355a5b561f9e req-f0c3db1f-138a-40e2-9f9f-c47dc1d5c76c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-af5c0316-71bb-4106-9081-60ea7debb485" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:25:53 compute-0 nova_compute[187212]: 2025-11-25 19:25:53.584 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:54 compute-0 nova_compute[187212]: 2025-11-25 19:25:54.070 187216 INFO nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 25 19:25:54 compute-0 podman[216108]: 2025-11-25 19:25:54.179218425 +0000 UTC m=+0.083989432 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:25:54 compute-0 nova_compute[187212]: 2025-11-25 19:25:54.573 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Nov 25 19:25:54 compute-0 nova_compute[187212]: 2025-11-25 19:25:54.574 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.078 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.079 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.270 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.586 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.587 187216 DEBUG nova.virt.libvirt.migration [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Nov 25 19:25:55 compute-0 kernel: tap60d64df6-78 (unregistering): left promiscuous mode
Nov 25 19:25:55 compute-0 NetworkManager[55552]: <info>  [1764098755.6067] device (tap60d64df6-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.618 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:55 compute-0 ovn_controller[95465]: 2025-11-25T19:25:55Z|00157|binding|INFO|Releasing lport 60d64df6-789b-4ebc-bd01-a5d0912572f7 from this chassis (sb_readonly=0)
Nov 25 19:25:55 compute-0 ovn_controller[95465]: 2025-11-25T19:25:55Z|00158|binding|INFO|Setting lport 60d64df6-789b-4ebc-bd01-a5d0912572f7 down in Southbound
Nov 25 19:25:55 compute-0 ovn_controller[95465]: 2025-11-25T19:25:55Z|00159|binding|INFO|Removing iface tap60d64df6-78 ovn-installed in OVS
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.622 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.632 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:39:91 10.100.0.5'], port_security=['fa:16:3e:fe:39:91 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'e4cabb6d-41dc-47ab-8bea-ab69f1d603df'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'af5c0316-71bb-4106-9081-60ea7debb485', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0287f0353d44a63af6cafda5ee0aa0c', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'b7ddaac2-ed9c-4646-93a4-964aad68db2c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e071529-0293-4440-9c70-07d9694c0383, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=60d64df6-789b-4ebc-bd01-a5d0912572f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.635 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 60d64df6-789b-4ebc-bd01-a5d0912572f7 in datapath 1d90bb72-93e5-4ff5-baa5-d0e187ade418 unbound from our chassis
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.637 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d90bb72-93e5-4ff5-baa5-d0e187ade418, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.640 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f00cda8b-7d4d-4c19-994a-0c66e63f9a99]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.642 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418 namespace which is not needed anymore
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.651 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:55 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000011.scope: Deactivated successfully.
Nov 25 19:25:55 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000011.scope: Consumed 16.062s CPU time.
Nov 25 19:25:55 compute-0 systemd-machined[153494]: Machine qemu-14-instance-00000011 terminated.
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.812 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:55 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [NOTICE]   (215942) : haproxy version is 3.0.5-8e879a5
Nov 25 19:25:55 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [NOTICE]   (215942) : path to executable is /usr/sbin/haproxy
Nov 25 19:25:55 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [WARNING]  (215942) : Exiting Master process...
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.820 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:55 compute-0 podman[216154]: 2025-11-25 19:25:55.819341505 +0000 UTC m=+0.047513846 container kill 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:25:55 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [ALERT]    (215942) : Current worker (215944) exited with code 143 (Terminated)
Nov 25 19:25:55 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[215930]: [WARNING]  (215942) : All workers exited. Exiting... (0)
Nov 25 19:25:55 compute-0 systemd[1]: libpod-46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176.scope: Deactivated successfully.
Nov 25 19:25:55 compute-0 conmon[215930]: conmon 46578fadce1458018d05 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176.scope/container/memory.events
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.880 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.880 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Nov 25 19:25:55 compute-0 nova_compute[187212]: 2025-11-25 19:25:55.883 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Nov 25 19:25:55 compute-0 podman[216175]: 2025-11-25 19:25:55.901329104 +0000 UTC m=+0.052432065 container died 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Nov 25 19:25:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176-userdata-shm.mount: Deactivated successfully.
Nov 25 19:25:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-e3fe65dd2381cb7b53e12b3af346cc0e4d724c65e898ab54c7a6236c31f2c461-merged.mount: Deactivated successfully.
Nov 25 19:25:55 compute-0 podman[216175]: 2025-11-25 19:25:55.953591933 +0000 UTC m=+0.104694864 container cleanup 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 19:25:55 compute-0 systemd[1]: libpod-conmon-46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176.scope: Deactivated successfully.
Nov 25 19:25:55 compute-0 podman[216190]: 2025-11-25 19:25:55.980589421 +0000 UTC m=+0.109555632 container remove 46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.990 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[02b80527-e40c-4760-8090-907a47729b67]: (4, ("Tue Nov 25 07:25:55 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418 (46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176)\n46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176\nTue Nov 25 07:25:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418 (46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176)\n46578fadce1458018d05245ae48022d3607d0632ea0c320a3479cd8b9a96f176\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.992 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[610553a4-48e0-4434-b8b2-98885f53e4e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.993 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.994 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d795a0-f16a-4a5a-b106-c23c9b76db75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:55 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:55.995 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d90bb72-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.023 187216 DEBUG nova.compute.manager [req-163f8499-eafc-46d3-8ec9-086008ae24ec req-1f214cd5-8bcc-4b59-90b2-a23d08194477 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.024 187216 DEBUG oslo_concurrency.lockutils [req-163f8499-eafc-46d3-8ec9-086008ae24ec req-1f214cd5-8bcc-4b59-90b2-a23d08194477 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.025 187216 DEBUG oslo_concurrency.lockutils [req-163f8499-eafc-46d3-8ec9-086008ae24ec req-1f214cd5-8bcc-4b59-90b2-a23d08194477 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.025 187216 DEBUG oslo_concurrency.lockutils [req-163f8499-eafc-46d3-8ec9-086008ae24ec req-1f214cd5-8bcc-4b59-90b2-a23d08194477 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.025 187216 DEBUG nova.compute.manager [req-163f8499-eafc-46d3-8ec9-086008ae24ec req-1f214cd5-8bcc-4b59-90b2-a23d08194477 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.026 187216 DEBUG nova.compute.manager [req-163f8499-eafc-46d3-8ec9-086008ae24ec req-1f214cd5-8bcc-4b59-90b2-a23d08194477 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.044 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 kernel: tap1d90bb72-90: left promiscuous mode
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.074 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.076 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:56.079 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[d15bd657-6e09-4723-8faa-7f956ea374c3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.090 187216 DEBUG nova.virt.libvirt.guest [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'af5c0316-71bb-4106-9081-60ea7debb485' (instance-00000011) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.092 187216 INFO nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migration operation has completed
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.093 187216 INFO nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] _post_live_migration() is started..
Nov 25 19:25:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:56.098 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[2a81f95c-6648-4f93-be32-38a486a71306]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:56.099 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[42696094-2baa-4da2-941d-871ed2bd76d7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.109 187216 WARNING neutronclient.v2_0.client [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.110 187216 WARNING neutronclient.v2_0.client [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:25:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:56.122 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[46a3f3a3-7098-4cae-b8d4-565387d08e19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466666, 'reachable_time': 31484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216220, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:56.125 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:25:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:25:56.125 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[12942a42-bfb1-403a-ae9c-57a1d494926b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:25:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d1d90bb72\x2d93e5\x2d4ff5\x2dbaa5\x2dd0e187ade418.mount: Deactivated successfully.
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.423 187216 DEBUG nova.compute.manager [req-598d3045-eb52-4ba1-96d5-b06b6bb88401 req-fbdc83cc-9126-4da8-a696-a5868f4d298c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.424 187216 DEBUG oslo_concurrency.lockutils [req-598d3045-eb52-4ba1-96d5-b06b6bb88401 req-fbdc83cc-9126-4da8-a696-a5868f4d298c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.424 187216 DEBUG oslo_concurrency.lockutils [req-598d3045-eb52-4ba1-96d5-b06b6bb88401 req-fbdc83cc-9126-4da8-a696-a5868f4d298c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.424 187216 DEBUG oslo_concurrency.lockutils [req-598d3045-eb52-4ba1-96d5-b06b6bb88401 req-fbdc83cc-9126-4da8-a696-a5868f4d298c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.425 187216 DEBUG nova.compute.manager [req-598d3045-eb52-4ba1-96d5-b06b6bb88401 req-fbdc83cc-9126-4da8-a696-a5868f4d298c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.425 187216 DEBUG nova.compute.manager [req-598d3045-eb52-4ba1-96d5-b06b6bb88401 req-fbdc83cc-9126-4da8-a696-a5868f4d298c 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.587 187216 DEBUG nova.network.neutron [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Activated binding for port 60d64df6-789b-4ebc-bd01-a5d0912572f7 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.588 187216 DEBUG nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.589 187216 DEBUG nova.virt.libvirt.vif [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:25:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-956366604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-956',id=17,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:25:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e0287f0353d44a63af6cafda5ee0aa0c',ramdisk_id='',reservation_id='r-ltil8qq1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:25:32Z,user_data=None,user_id='e87bb944d08a433ca7ecc2309e015e24',uuid=af5c0316-71bb-4106-9081-60ea7debb485,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.589 187216 DEBUG nova.network.os_vif_util [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converting VIF {"id": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "address": "fa:16:3e:fe:39:91", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d64df6-78", "ovs_interfaceid": "60d64df6-789b-4ebc-bd01-a5d0912572f7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.590 187216 DEBUG nova.network.os_vif_util [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.591 187216 DEBUG os_vif [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.594 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.594 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60d64df6-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.596 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.598 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.599 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.600 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c85b0bc9-8b6f-44fb-8589-bfa54fd1e886) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.601 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.602 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.604 187216 INFO os_vif [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:39:91,bridge_name='br-int',has_traffic_filtering=True,id=60d64df6-789b-4ebc-bd01-a5d0912572f7,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d64df6-78')
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.605 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.605 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.606 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.606 187216 DEBUG nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.606 187216 INFO nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Deleting instance files /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485_del
Nov 25 19:25:56 compute-0 nova_compute[187212]: 2025-11-25 19:25:56.608 187216 INFO nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Deletion of /var/lib/nova/instances/af5c0316-71bb-4106-9081-60ea7debb485_del complete
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.095 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.095 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.096 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.097 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.097 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.097 187216 WARNING nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received unexpected event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with vm_state active and task_state migrating.
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.098 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.098 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.098 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.099 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.099 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.099 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-unplugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.099 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.100 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.100 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.100 187216 DEBUG oslo_concurrency.lockutils [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.100 187216 DEBUG nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.101 187216 WARNING nova.compute.manager [req-fdeaafc5-a8d2-4a4a-93c1-06562fe345e0 req-7affe0d4-6eb5-4f35-a4bb-7cd92faa3979 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received unexpected event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with vm_state active and task_state migrating.
Nov 25 19:25:58 compute-0 nova_compute[187212]: 2025-11-25 19:25:58.587 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:25:59 compute-0 podman[216221]: 2025-11-25 19:25:59.159272973 +0000 UTC m=+0.084951838 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible)
Nov 25 19:25:59 compute-0 podman[197585]: time="2025-11-25T19:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:25:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:25:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2623 "" "Go-http-client/1.1"
Nov 25 19:26:00 compute-0 nova_compute[187212]: 2025-11-25 19:26:00.207 187216 DEBUG nova.compute.manager [req-0ee85eff-9aa3-40b4-aaca-a66aef391cca req-b1322076-26ce-4f31-8f34-61d1cd8e6883 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:26:00 compute-0 nova_compute[187212]: 2025-11-25 19:26:00.207 187216 DEBUG oslo_concurrency.lockutils [req-0ee85eff-9aa3-40b4-aaca-a66aef391cca req-b1322076-26ce-4f31-8f34-61d1cd8e6883 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:00 compute-0 nova_compute[187212]: 2025-11-25 19:26:00.208 187216 DEBUG oslo_concurrency.lockutils [req-0ee85eff-9aa3-40b4-aaca-a66aef391cca req-b1322076-26ce-4f31-8f34-61d1cd8e6883 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:00 compute-0 nova_compute[187212]: 2025-11-25 19:26:00.208 187216 DEBUG oslo_concurrency.lockutils [req-0ee85eff-9aa3-40b4-aaca-a66aef391cca req-b1322076-26ce-4f31-8f34-61d1cd8e6883 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:00 compute-0 nova_compute[187212]: 2025-11-25 19:26:00.208 187216 DEBUG nova.compute.manager [req-0ee85eff-9aa3-40b4-aaca-a66aef391cca req-b1322076-26ce-4f31-8f34-61d1cd8e6883 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] No waiting events found dispatching network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:26:00 compute-0 nova_compute[187212]: 2025-11-25 19:26:00.209 187216 WARNING nova.compute.manager [req-0ee85eff-9aa3-40b4-aaca-a66aef391cca req-b1322076-26ce-4f31-8f34-61d1cd8e6883 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Received unexpected event network-vif-plugged-60d64df6-789b-4ebc-bd01-a5d0912572f7 for instance with vm_state active and task_state migrating.
Nov 25 19:26:01 compute-0 podman[216243]: 2025-11-25 19:26:01.141145522 +0000 UTC m=+0.075390978 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: ERROR   19:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: ERROR   19:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: ERROR   19:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: ERROR   19:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: ERROR   19:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:26:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:26:01 compute-0 nova_compute[187212]: 2025-11-25 19:26:01.602 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:03 compute-0 nova_compute[187212]: 2025-11-25 19:26:03.633 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:06 compute-0 nova_compute[187212]: 2025-11-25 19:26:06.604 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:08 compute-0 nova_compute[187212]: 2025-11-25 19:26:08.651 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "af5c0316-71bb-4106-9081-60ea7debb485-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:08 compute-0 nova_compute[187212]: 2025-11-25 19:26:08.652 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:08 compute-0 nova_compute[187212]: 2025-11-25 19:26:08.653 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "af5c0316-71bb-4106-9081-60ea7debb485-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:08 compute-0 nova_compute[187212]: 2025-11-25 19:26:08.680 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.200 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.200 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.201 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.201 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.460 187216 WARNING nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.462 187216 DEBUG oslo_concurrency.processutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.496 187216 DEBUG oslo_concurrency.processutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.497 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=72.99237823486328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": 
"0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.498 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:09 compute-0 nova_compute[187212]: 2025-11-25 19:26:09.498 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:10 compute-0 nova_compute[187212]: 2025-11-25 19:26:10.528 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Migration for instance af5c0316-71bb-4106-9081-60ea7debb485 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.037 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.090 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Migration 92eb4831-4cbb-4d68-8c0b-f9b944d28ad9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.091 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.091 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:26:09 up  1:18,  0 user,  load average: 0.32, 0.35, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.137 187216 DEBUG nova.compute.provider_tree [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.606 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:11 compute-0 nova_compute[187212]: 2025-11-25 19:26:11.645 187216 DEBUG nova.scheduler.client.report [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:26:12 compute-0 nova_compute[187212]: 2025-11-25 19:26:12.050 187216 DEBUG nova.compute.manager [None req-c42edbec-e523-4e3c-b1e3-f30c3441ed00 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Nov 25 19:26:12 compute-0 nova_compute[187212]: 2025-11-25 19:26:12.129 187216 DEBUG nova.compute.provider_tree [None req-c42edbec-e523-4e3c-b1e3-f30c3441ed00 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Updating resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 generation from 19 to 20 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Nov 25 19:26:12 compute-0 nova_compute[187212]: 2025-11-25 19:26:12.157 187216 DEBUG nova.compute.resource_tracker [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:26:12 compute-0 nova_compute[187212]: 2025-11-25 19:26:12.158 187216 DEBUG oslo_concurrency.lockutils [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.659s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:12 compute-0 nova_compute[187212]: 2025-11-25 19:26:12.178 187216 INFO nova.compute.manager [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 25 19:26:13 compute-0 nova_compute[187212]: 2025-11-25 19:26:13.273 187216 INFO nova.scheduler.client.report [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Deleted allocation for migration 92eb4831-4cbb-4d68-8c0b-f9b944d28ad9
Nov 25 19:26:13 compute-0 nova_compute[187212]: 2025-11-25 19:26:13.274 187216 DEBUG nova.virt.libvirt.driver [None req-dedfc75c-e032-4f7f-94c1-d069dec41ec6 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: af5c0316-71bb-4106-9081-60ea7debb485] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Nov 25 19:26:13 compute-0 nova_compute[187212]: 2025-11-25 19:26:13.682 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:15 compute-0 podman[216265]: 2025-11-25 19:26:15.151926456 +0000 UTC m=+0.074206205 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:26:16 compute-0 nova_compute[187212]: 2025-11-25 19:26:16.609 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:18 compute-0 nova_compute[187212]: 2025-11-25 19:26:18.719 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:21 compute-0 nova_compute[187212]: 2025-11-25 19:26:21.612 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:22 compute-0 podman[216289]: 2025-11-25 19:26:22.247150248 +0000 UTC m=+0.169380330 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:26:23 compute-0 nova_compute[187212]: 2025-11-25 19:26:23.719 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:24 compute-0 nova_compute[187212]: 2025-11-25 19:26:24.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:24 compute-0 nova_compute[187212]: 2025-11-25 19:26:24.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:26:25 compute-0 podman[216315]: 2025-11-25 19:26:25.174021211 +0000 UTC m=+0.092519026 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:26:26 compute-0 nova_compute[187212]: 2025-11-25 19:26:26.654 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:28 compute-0 nova_compute[187212]: 2025-11-25 19:26:28.721 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:29 compute-0 nova_compute[187212]: 2025-11-25 19:26:29.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:29 compute-0 podman[197585]: time="2025-11-25T19:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:26:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:26:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2617 "" "Go-http-client/1.1"
Nov 25 19:26:30 compute-0 podman[216335]: 2025-11-25 19:26:30.167564395 +0000 UTC m=+0.089019916 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 19:26:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:26:31.107 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:26:31.107 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:26:31.108 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:31 compute-0 nova_compute[187212]: 2025-11-25 19:26:31.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: ERROR   19:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: ERROR   19:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: ERROR   19:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: ERROR   19:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: ERROR   19:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:26:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:26:31 compute-0 nova_compute[187212]: 2025-11-25 19:26:31.656 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:32 compute-0 podman[216358]: 2025-11-25 19:26:32.164141738 +0000 UTC m=+0.085584284 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.685 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.685 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.686 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.909 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.911 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.943 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.944 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5820MB free_disk=72.99235916137695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.945 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:32 compute-0 nova_compute[187212]: 2025-11-25 19:26:32.945 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:33 compute-0 nova_compute[187212]: 2025-11-25 19:26:33.763 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:34 compute-0 nova_compute[187212]: 2025-11-25 19:26:34.064 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:26:34 compute-0 nova_compute[187212]: 2025-11-25 19:26:34.064 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:26:32 up  1:19,  0 user,  load average: 0.29, 0.34, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:26:34 compute-0 nova_compute[187212]: 2025-11-25 19:26:34.158 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:26:34 compute-0 nova_compute[187212]: 2025-11-25 19:26:34.666 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:26:35 compute-0 nova_compute[187212]: 2025-11-25 19:26:35.179 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:26:35 compute-0 nova_compute[187212]: 2025-11-25 19:26:35.179 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.234s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:26:36 compute-0 nova_compute[187212]: 2025-11-25 19:26:36.177 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:36 compute-0 nova_compute[187212]: 2025-11-25 19:26:36.178 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:36 compute-0 nova_compute[187212]: 2025-11-25 19:26:36.178 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:36 compute-0 nova_compute[187212]: 2025-11-25 19:26:36.659 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:38 compute-0 nova_compute[187212]: 2025-11-25 19:26:38.764 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:41 compute-0 nova_compute[187212]: 2025-11-25 19:26:41.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:26:41 compute-0 nova_compute[187212]: 2025-11-25 19:26:41.662 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:43 compute-0 nova_compute[187212]: 2025-11-25 19:26:43.766 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:46 compute-0 podman[216380]: 2025-11-25 19:26:46.194500066 +0000 UTC m=+0.108158345 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:26:46 compute-0 nova_compute[187212]: 2025-11-25 19:26:46.665 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:48 compute-0 nova_compute[187212]: 2025-11-25 19:26:48.802 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:51 compute-0 sshd-session[216405]: Connection closed by 209.38.103.174 port 34354
Nov 25 19:26:51 compute-0 nova_compute[187212]: 2025-11-25 19:26:51.667 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:53 compute-0 podman[216406]: 2025-11-25 19:26:53.233255929 +0000 UTC m=+0.157377686 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:26:53 compute-0 nova_compute[187212]: 2025-11-25 19:26:53.861 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:56 compute-0 podman[216432]: 2025-11-25 19:26:56.169855887 +0000 UTC m=+0.085258836 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 25 19:26:56 compute-0 nova_compute[187212]: 2025-11-25 19:26:56.669 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:57 compute-0 nova_compute[187212]: 2025-11-25 19:26:57.368 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:57 compute-0 nova_compute[187212]: 2025-11-25 19:26:57.368 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:57 compute-0 nova_compute[187212]: 2025-11-25 19:26:57.875 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:26:58 compute-0 nova_compute[187212]: 2025-11-25 19:26:58.431 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:26:58 compute-0 nova_compute[187212]: 2025-11-25 19:26:58.431 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:26:58 compute-0 nova_compute[187212]: 2025-11-25 19:26:58.439 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:26:58 compute-0 nova_compute[187212]: 2025-11-25 19:26:58.439 187216 INFO nova.compute.claims [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:26:58 compute-0 nova_compute[187212]: 2025-11-25 19:26:58.864 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:26:59 compute-0 nova_compute[187212]: 2025-11-25 19:26:59.559 187216 DEBUG nova.compute.provider_tree [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:26:59 compute-0 podman[197585]: time="2025-11-25T19:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:26:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17076 "" "Go-http-client/1.1"
Nov 25 19:26:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2623 "" "Go-http-client/1.1"
Nov 25 19:27:00 compute-0 nova_compute[187212]: 2025-11-25 19:27:00.069 187216 DEBUG nova.scheduler.client.report [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:27:00 compute-0 nova_compute[187212]: 2025-11-25 19:27:00.583 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.152s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:00 compute-0 nova_compute[187212]: 2025-11-25 19:27:00.585 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:27:01 compute-0 nova_compute[187212]: 2025-11-25 19:27:01.100 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:27:01 compute-0 nova_compute[187212]: 2025-11-25 19:27:01.100 187216 DEBUG nova.network.neutron [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:27:01 compute-0 nova_compute[187212]: 2025-11-25 19:27:01.101 187216 WARNING neutronclient.v2_0.client [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:27:01 compute-0 nova_compute[187212]: 2025-11-25 19:27:01.101 187216 WARNING neutronclient.v2_0.client [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:27:01 compute-0 podman[216452]: 2025-11-25 19:27:01.180754387 +0000 UTC m=+0.102781105 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: ERROR   19:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: ERROR   19:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: ERROR   19:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: ERROR   19:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: ERROR   19:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:27:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:27:01 compute-0 nova_compute[187212]: 2025-11-25 19:27:01.613 187216 INFO nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:27:01 compute-0 nova_compute[187212]: 2025-11-25 19:27:01.670 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:02 compute-0 nova_compute[187212]: 2025-11-25 19:27:02.123 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.145 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.146 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.147 187216 INFO nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Creating image(s)
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.148 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "/var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.149 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "/var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.150 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "/var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.150 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.156 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.158 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:03 compute-0 podman[216473]: 2025-11-25 19:27:03.182640789 +0000 UTC m=+0.094841616 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.238 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.239 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.240 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.241 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.247 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.248 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.319 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.320 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.371 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.373 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.373 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.452 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.454 187216 DEBUG nova.virt.disk.api [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Checking if we can resize image /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.455 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.539 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.541 187216 DEBUG nova.virt.disk.api [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Cannot resize image /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.541 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.542 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Ensure instance console log exists: /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.543 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.543 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.544 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:03 compute-0 nova_compute[187212]: 2025-11-25 19:27:03.902 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.049 187216 DEBUG nova.network.neutron [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Successfully created port: bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.876 187216 DEBUG nova.network.neutron [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Successfully updated port: bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.960 187216 DEBUG nova.compute.manager [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Received event network-changed-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.960 187216 DEBUG nova.compute.manager [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Refreshing instance network info cache due to event network-changed-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.961 187216 DEBUG oslo_concurrency.lockutils [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-f71d9429-2da3-4b6b-b82d-63027e46f952" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.961 187216 DEBUG oslo_concurrency.lockutils [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-f71d9429-2da3-4b6b-b82d-63027e46f952" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:27:04 compute-0 nova_compute[187212]: 2025-11-25 19:27:04.961 187216 DEBUG nova.network.neutron [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Refreshing network info cache for port bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:27:05 compute-0 nova_compute[187212]: 2025-11-25 19:27:05.395 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "refresh_cache-f71d9429-2da3-4b6b-b82d-63027e46f952" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:27:05 compute-0 nova_compute[187212]: 2025-11-25 19:27:05.534 187216 WARNING neutronclient.v2_0.client [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:27:05 compute-0 nova_compute[187212]: 2025-11-25 19:27:05.688 187216 DEBUG nova.network.neutron [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:27:05 compute-0 nova_compute[187212]: 2025-11-25 19:27:05.948 187216 DEBUG nova.network.neutron [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:27:06 compute-0 nova_compute[187212]: 2025-11-25 19:27:06.467 187216 DEBUG oslo_concurrency.lockutils [req-b56c9b56-fbe5-413f-9454-919dd7c393a6 req-858e83cf-44cf-4c6a-9d4d-081d3bc4a994 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-f71d9429-2da3-4b6b-b82d-63027e46f952" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:27:06 compute-0 nova_compute[187212]: 2025-11-25 19:27:06.468 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquired lock "refresh_cache-f71d9429-2da3-4b6b-b82d-63027e46f952" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:27:06 compute-0 nova_compute[187212]: 2025-11-25 19:27:06.468 187216 DEBUG nova.network.neutron [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:27:06 compute-0 nova_compute[187212]: 2025-11-25 19:27:06.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:07 compute-0 nova_compute[187212]: 2025-11-25 19:27:07.696 187216 DEBUG nova.network.neutron [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:27:08 compute-0 nova_compute[187212]: 2025-11-25 19:27:08.669 187216 WARNING neutronclient.v2_0.client [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:27:08 compute-0 nova_compute[187212]: 2025-11-25 19:27:08.952 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:09 compute-0 nova_compute[187212]: 2025-11-25 19:27:09.221 187216 DEBUG nova.network.neutron [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Updating instance_info_cache with network_info: [{"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.163 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Releasing lock "refresh_cache-f71d9429-2da3-4b6b-b82d-63027e46f952" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.164 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Instance network_info: |[{"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.168 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Start _get_guest_xml network_info=[{"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.173 187216 WARNING nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.175 187216 DEBUG nova.virt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-428471061', uuid='f71d9429-2da3-4b6b-b82d-63027e46f952'), owner=OwnerMeta(userid='e87bb944d08a433ca7ecc2309e015e24', username='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin', projectid='e0287f0353d44a63af6cafda5ee0aa0c', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764098831.1756954) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.185 187216 DEBUG nova.virt.libvirt.host [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.185 187216 DEBUG nova.virt.libvirt.host [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.195 187216 DEBUG nova.virt.libvirt.host [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.196 187216 DEBUG nova.virt.libvirt.host [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.197 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.197 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.198 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.199 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.199 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.199 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.200 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.200 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.200 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.201 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.201 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.202 187216 DEBUG nova.virt.hardware [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.207 187216 DEBUG nova.virt.libvirt.vif [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:26:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-428471061',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-428',id=19,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e0287f0353d44a63af6cafda5ee0aa0c',ramdisk_id='',reservation_id='r-zcbq4705',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825',owner
_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:27:02Z,user_data=None,user_id='e87bb944d08a433ca7ecc2309e015e24',uuid=f71d9429-2da3-4b6b-b82d-63027e46f952,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.208 187216 DEBUG nova.network.os_vif_util [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converting VIF {"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.209 187216 DEBUG nova.network.os_vif_util [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:17:61,bridge_name='br-int',has_traffic_filtering=True,id=bfcb9d3b-425e-4d5f-b3bf-25ff4655b093,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcb9d3b-42') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.210 187216 DEBUG nova.objects.instance [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lazy-loading 'pci_devices' on Instance uuid f71d9429-2da3-4b6b-b82d-63027e46f952 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.737 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <uuid>f71d9429-2da3-4b6b-b82d-63027e46f952</uuid>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <name>instance-00000013</name>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-428471061</nova:name>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:27:11</nova:creationTime>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:27:11 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:27:11 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:user uuid="e87bb944d08a433ca7ecc2309e015e24">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin</nova:user>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:project uuid="e0287f0353d44a63af6cafda5ee0aa0c">tempest-TestExecuteNodeResourceConsolidationStrategy-641830825</nova:project>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         <nova:port uuid="bfcb9d3b-425e-4d5f-b3bf-25ff4655b093">
Nov 25 19:27:11 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <system>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <entry name="serial">f71d9429-2da3-4b6b-b82d-63027e46f952</entry>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <entry name="uuid">f71d9429-2da3-4b6b-b82d-63027e46f952</entry>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </system>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <os>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </os>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <features>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </features>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.config"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:97:17:61"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <target dev="tapbfcb9d3b-42"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/console.log" append="off"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <video>
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </video>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:27:11 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:27:11 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:27:11 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:27:11 compute-0 nova_compute[187212]: </domain>
Nov 25 19:27:11 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.738 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Preparing to wait for external event network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.739 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.739 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.739 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.741 187216 DEBUG nova.virt.libvirt.vif [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:26:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-428471061',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-428',id=19,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e0287f0353d44a63af6cafda5ee0aa0c',ramdisk_id='',reservation_id='r-zcbq4705',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830
825',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-641830825-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:27:02Z,user_data=None,user_id='e87bb944d08a433ca7ecc2309e015e24',uuid=f71d9429-2da3-4b6b-b82d-63027e46f952,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.741 187216 DEBUG nova.network.os_vif_util [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converting VIF {"id": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "address": "fa:16:3e:97:17:61", "network": {"id": "1d90bb72-93e5-4ff5-baa5-d0e187ade418", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-721583290-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b3c9934abb6540418711f0a3d8d13862", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcb9d3b-42", "ovs_interfaceid": "bfcb9d3b-425e-4d5f-b3bf-25ff4655b093", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.742 187216 DEBUG nova.network.os_vif_util [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:17:61,bridge_name='br-int',has_traffic_filtering=True,id=bfcb9d3b-425e-4d5f-b3bf-25ff4655b093,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcb9d3b-42') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.743 187216 DEBUG os_vif [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:17:61,bridge_name='br-int',has_traffic_filtering=True,id=bfcb9d3b-425e-4d5f-b3bf-25ff4655b093,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcb9d3b-42') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.744 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.745 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.745 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.746 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.747 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5fd108f2-86b8-5a7a-8cdc-f6cffe218fe2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.749 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.751 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.754 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.754 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfcb9d3b-42, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.755 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbfcb9d3b-42, col_values=(('qos', UUID('fb2ac461-5509-4854-82d6-f71209d79320')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.755 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbfcb9d3b-42, col_values=(('external_ids', {'iface-id': 'bfcb9d3b-425e-4d5f-b3bf-25ff4655b093', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:17:61', 'vm-uuid': 'f71d9429-2da3-4b6b-b82d-63027e46f952'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.757 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 NetworkManager[55552]: <info>  [1764098831.7589] manager: (tapbfcb9d3b-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.759 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.766 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:11 compute-0 nova_compute[187212]: 2025-11-25 19:27:11.767 187216 INFO os_vif [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:17:61,bridge_name='br-int',has_traffic_filtering=True,id=bfcb9d3b-425e-4d5f-b3bf-25ff4655b093,network=Network(1d90bb72-93e5-4ff5-baa5-d0e187ade418),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcb9d3b-42')
Nov 25 19:27:13 compute-0 nova_compute[187212]: 2025-11-25 19:27:13.693 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:27:13 compute-0 nova_compute[187212]: 2025-11-25 19:27:13.695 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:27:13 compute-0 nova_compute[187212]: 2025-11-25 19:27:13.695 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] No VIF found with MAC fa:16:3e:97:17:61, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:27:13 compute-0 nova_compute[187212]: 2025-11-25 19:27:13.696 187216 INFO nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Using config drive
Nov 25 19:27:13 compute-0 nova_compute[187212]: 2025-11-25 19:27:13.954 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:14 compute-0 nova_compute[187212]: 2025-11-25 19:27:14.435 187216 WARNING neutronclient.v2_0.client [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:14.999 187216 INFO nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Creating config drive at /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.config
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.009 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpeku03y03 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.154 187216 DEBUG oslo_concurrency.processutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpeku03y03" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:15 compute-0 kernel: tapbfcb9d3b-42: entered promiscuous mode
Nov 25 19:27:15 compute-0 ovn_controller[95465]: 2025-11-25T19:27:15Z|00160|binding|INFO|Claiming lport bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 for this chassis.
Nov 25 19:27:15 compute-0 ovn_controller[95465]: 2025-11-25T19:27:15Z|00161|binding|INFO|bfcb9d3b-425e-4d5f-b3bf-25ff4655b093: Claiming fa:16:3e:97:17:61 10.100.0.5
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.242 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 NetworkManager[55552]: <info>  [1764098835.2457] manager: (tapbfcb9d3b-42): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 25 19:27:15 compute-0 ovn_controller[95465]: 2025-11-25T19:27:15Z|00162|binding|INFO|Setting lport bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 ovn-installed in OVS
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.267 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.271 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 systemd-udevd[216528]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:27:15 compute-0 systemd-machined[153494]: New machine qemu-15-instance-00000013.
Nov 25 19:27:15 compute-0 NetworkManager[55552]: <info>  [1764098835.3156] device (tapbfcb9d3b-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:27:15 compute-0 NetworkManager[55552]: <info>  [1764098835.3172] device (tapbfcb9d3b-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:27:15 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000013.
Nov 25 19:27:15 compute-0 ovn_controller[95465]: 2025-11-25T19:27:15Z|00163|binding|INFO|Setting lport bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 up in Southbound
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.354 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:17:61 10.100.0.5'], port_security=['fa:16:3e:97:17:61 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f71d9429-2da3-4b6b-b82d-63027e46f952', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e0287f0353d44a63af6cafda5ee0aa0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b7ddaac2-ed9c-4646-93a4-964aad68db2c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e071529-0293-4440-9c70-07d9694c0383, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=bfcb9d3b-425e-4d5f-b3bf-25ff4655b093) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.355 104356 INFO neutron.agent.ovn.metadata.agent [-] Port bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 in datapath 1d90bb72-93e5-4ff5-baa5-d0e187ade418 bound to our chassis
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.357 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1d90bb72-93e5-4ff5-baa5-d0e187ade418
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.373 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f2294f8a-a879-4b52-8b38-535975bf0dce]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.374 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1d90bb72-91 in ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.376 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1d90bb72-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.377 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6b18ba-1c63-4613-83a0-73ecb39d5a84]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.378 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[18331c96-2734-403f-955c-d517f8b30f51]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.396 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba5aa29-d47b-4a36-b24a-a13b20caf18b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.417 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[22727c2a-c243-4bf8-bfbc-d6a8bc16e365]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.461 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[d9198c09-7aff-4d2c-b7c0-f5335e3ca88c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 systemd-udevd[216531]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:27:15 compute-0 NetworkManager[55552]: <info>  [1764098835.4702] manager: (tap1d90bb72-90): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.472 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[d1596f96-e471-4f57-9772-2497f3d37e7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.517 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[5b83d687-69ed-4707-8044-2a8b1428a3b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.521 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[8fcc1420-308e-4cbd-a387-8aad5c9c4ab4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 NetworkManager[55552]: <info>  [1764098835.5565] device (tap1d90bb72-90): carrier: link connected
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.566 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[3682115a-c5c1-4da9-b2a9-8fdce4edbe12]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.592 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[547ac8ef-37ee-48bc-ac16-8dfad5ec6e6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d90bb72-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:44:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478322, 'reachable_time': 43584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216562, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.614 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[41ea2748-48e3-4fe0-b7c8-472991c8af59]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:44de'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 478322, 'tstamp': 478322}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216563, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.639 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[1c68964c-cea1-4067-83fd-adbedc9140f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1d90bb72-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:44:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478322, 'reachable_time': 43584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216564, 'error': None, 'target': 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.682 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9101e275-7a50-4302-9d38-7232ffc39ba0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.771 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[63935ef0-f9f3-45a9-93a1-3902b3de1e36]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.773 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d90bb72-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.773 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.774 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d90bb72-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.776 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 NetworkManager[55552]: <info>  [1764098835.7776] manager: (tap1d90bb72-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 25 19:27:15 compute-0 kernel: tap1d90bb72-90: entered promiscuous mode
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.779 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.782 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1d90bb72-90, col_values=(('external_ids', {'iface-id': 'f3db0a73-6d5e-44f5-a754-565ad86befff'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.783 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.786 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 ovn_controller[95465]: 2025-11-25T19:27:15Z|00164|binding|INFO|Releasing lport f3db0a73-6d5e-44f5-a754-565ad86befff from this chassis (sb_readonly=0)
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.808 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[ff45c4de-b779-4eb2-b599-08038766941f]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.809 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.810 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.810 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 1d90bb72-93e5-4ff5-baa5-d0e187ade418 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.810 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:27:15 compute-0 nova_compute[187212]: 2025-11-25 19:27:15.812 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.811 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[05a2efb0-89f8-40c3-9f5d-3c62209ac8ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.813 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.813 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[46650aef-4b6d-4020-9ca9-9da2f789e7e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.814 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-1d90bb72-93e5-4ff5-baa5-d0e187ade418
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/1d90bb72-93e5-4ff5-baa5-d0e187ade418.pid.haproxy
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID 1d90bb72-93e5-4ff5-baa5-d0e187ade418
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:27:15 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:15.815 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'env', 'PROCESS_TAG=haproxy-1d90bb72-93e5-4ff5-baa5-d0e187ade418', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1d90bb72-93e5-4ff5-baa5-d0e187ade418.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:27:16 compute-0 podman[216596]: 2025-11-25 19:27:16.329987017 +0000 UTC m=+0.082711749 container create 78e7aee9eab6cecaf49a2664e6453d1867def78e0ed910151c19246ae2dd786d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Nov 25 19:27:16 compute-0 systemd[1]: Started libpod-conmon-78e7aee9eab6cecaf49a2664e6453d1867def78e0ed910151c19246ae2dd786d.scope.
Nov 25 19:27:16 compute-0 podman[216596]: 2025-11-25 19:27:16.287159384 +0000 UTC m=+0.039884127 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:27:16 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:27:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0515188c50cd4b436ca3ac528ba8a0ea98cdb1a79e61796def0a034fbda52221/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:27:16 compute-0 podman[216596]: 2025-11-25 19:27:16.412166271 +0000 UTC m=+0.164891013 container init 78e7aee9eab6cecaf49a2664e6453d1867def78e0ed910151c19246ae2dd786d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Nov 25 19:27:16 compute-0 podman[216596]: 2025-11-25 19:27:16.422766138 +0000 UTC m=+0.175490830 container start 78e7aee9eab6cecaf49a2664e6453d1867def78e0ed910151c19246ae2dd786d (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:27:16 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[216612]: [NOTICE]   (216633) : New worker (216641) forked
Nov 25 19:27:16 compute-0 neutron-haproxy-ovnmeta-1d90bb72-93e5-4ff5-baa5-d0e187ade418[216612]: [NOTICE]   (216633) : Loading success.
Nov 25 19:27:16 compute-0 podman[216609]: 2025-11-25 19:27:16.465914049 +0000 UTC m=+0.091427167 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:27:16 compute-0 nova_compute[187212]: 2025-11-25 19:27:16.758 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:16 compute-0 nova_compute[187212]: 2025-11-25 19:27:16.829 187216 DEBUG nova.compute.manager [req-21c76e36-1e73-433d-a969-26dd4d25e25d req-f5047575-7422-4add-9818-8b161cf1756a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Received event network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:27:16 compute-0 nova_compute[187212]: 2025-11-25 19:27:16.830 187216 DEBUG oslo_concurrency.lockutils [req-21c76e36-1e73-433d-a969-26dd4d25e25d req-f5047575-7422-4add-9818-8b161cf1756a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:16 compute-0 nova_compute[187212]: 2025-11-25 19:27:16.830 187216 DEBUG oslo_concurrency.lockutils [req-21c76e36-1e73-433d-a969-26dd4d25e25d req-f5047575-7422-4add-9818-8b161cf1756a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:16 compute-0 nova_compute[187212]: 2025-11-25 19:27:16.830 187216 DEBUG oslo_concurrency.lockutils [req-21c76e36-1e73-433d-a969-26dd4d25e25d req-f5047575-7422-4add-9818-8b161cf1756a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:16 compute-0 nova_compute[187212]: 2025-11-25 19:27:16.830 187216 DEBUG nova.compute.manager [req-21c76e36-1e73-433d-a969-26dd4d25e25d req-f5047575-7422-4add-9818-8b161cf1756a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Processing event network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.030 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.035 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.038 187216 INFO nova.virt.libvirt.driver [-] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Instance spawned successfully.
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.039 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:27:17 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:17.267 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.268 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:17 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:17.268 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.622 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.623 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.625 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.626 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.627 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:27:17 compute-0 nova_compute[187212]: 2025-11-25 19:27:17.628 187216 DEBUG nova.virt.libvirt.driver [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.215 187216 INFO nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Took 15.07 seconds to spawn the instance on the hypervisor.
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.217 187216 DEBUG nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.969 187216 DEBUG nova.compute.manager [req-5cf988aa-0b21-4f53-b19e-60cbf4ec7308 req-f61b9a79-847e-478e-866e-5202f7bfbe4a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Received event network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.970 187216 DEBUG oslo_concurrency.lockutils [req-5cf988aa-0b21-4f53-b19e-60cbf4ec7308 req-f61b9a79-847e-478e-866e-5202f7bfbe4a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.970 187216 DEBUG oslo_concurrency.lockutils [req-5cf988aa-0b21-4f53-b19e-60cbf4ec7308 req-f61b9a79-847e-478e-866e-5202f7bfbe4a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.971 187216 DEBUG oslo_concurrency.lockutils [req-5cf988aa-0b21-4f53-b19e-60cbf4ec7308 req-f61b9a79-847e-478e-866e-5202f7bfbe4a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.971 187216 DEBUG nova.compute.manager [req-5cf988aa-0b21-4f53-b19e-60cbf4ec7308 req-f61b9a79-847e-478e-866e-5202f7bfbe4a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] No waiting events found dispatching network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.972 187216 WARNING nova.compute.manager [req-5cf988aa-0b21-4f53-b19e-60cbf4ec7308 req-f61b9a79-847e-478e-866e-5202f7bfbe4a 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Received unexpected event network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 for instance with vm_state active and task_state None.
Nov 25 19:27:18 compute-0 nova_compute[187212]: 2025-11-25 19:27:18.997 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:19 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:19.270 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:27:20 compute-0 nova_compute[187212]: 2025-11-25 19:27:20.869 187216 INFO nova.compute.manager [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Took 22.48 seconds to build instance.
Nov 25 19:27:21 compute-0 nova_compute[187212]: 2025-11-25 19:27:21.762 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:21 compute-0 nova_compute[187212]: 2025-11-25 19:27:21.764 187216 DEBUG oslo_concurrency.lockutils [None req-2b87824b-e43d-4be1-b5a3-20288e6c4158 e87bb944d08a433ca7ecc2309e015e24 e0287f0353d44a63af6cafda5ee0aa0c - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.396s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:23 compute-0 nova_compute[187212]: 2025-11-25 19:27:23.999 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:24 compute-0 podman[216658]: 2025-11-25 19:27:24.236757317 +0000 UTC m=+0.145914815 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:27:26 compute-0 nova_compute[187212]: 2025-11-25 19:27:26.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:26 compute-0 nova_compute[187212]: 2025-11-25 19:27:26.177 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:27:26 compute-0 nova_compute[187212]: 2025-11-25 19:27:26.767 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:27 compute-0 podman[216684]: 2025-11-25 19:27:27.163076846 +0000 UTC m=+0.085881902 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:27:29 compute-0 nova_compute[187212]: 2025-11-25 19:27:29.030 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:29 compute-0 podman[197585]: time="2025-11-25T19:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:27:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:27:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3073 "" "Go-http-client/1.1"
Nov 25 19:27:29 compute-0 ovn_controller[95465]: 2025-11-25T19:27:29Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:17:61 10.100.0.5
Nov 25 19:27:29 compute-0 ovn_controller[95465]: 2025-11-25T19:27:29Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:17:61 10.100.0.5
Nov 25 19:27:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:31.108 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:31.109 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:27:31.109 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:31 compute-0 nova_compute[187212]: 2025-11-25 19:27:31.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:31 compute-0 nova_compute[187212]: 2025-11-25 19:27:31.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: ERROR   19:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: ERROR   19:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: ERROR   19:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: ERROR   19:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: ERROR   19:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:27:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:27:31 compute-0 nova_compute[187212]: 2025-11-25 19:27:31.770 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:32 compute-0 nova_compute[187212]: 2025-11-25 19:27:32.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:32 compute-0 podman[216715]: 2025-11-25 19:27:32.176792468 +0000 UTC m=+0.095513604 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 19:27:32 compute-0 nova_compute[187212]: 2025-11-25 19:27:32.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:32 compute-0 nova_compute[187212]: 2025-11-25 19:27:32.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:32 compute-0 nova_compute[187212]: 2025-11-25 19:27:32.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:32 compute-0 nova_compute[187212]: 2025-11-25 19:27:32.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:27:33 compute-0 nova_compute[187212]: 2025-11-25 19:27:33.850 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:33 compute-0 nova_compute[187212]: 2025-11-25 19:27:33.947 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:33 compute-0 nova_compute[187212]: 2025-11-25 19:27:33.949 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.033 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.037 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.069 187216 DEBUG nova.virt.libvirt.driver [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Check if temp file /var/lib/nova/instances/tmpgrmwfoha exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.076 187216 DEBUG nova.compute.manager [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgrmwfoha',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f71d9429-2da3-4b6b-b82d-63027e46f952',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Nov 25 19:27:34 compute-0 podman[216746]: 2025-11-25 19:27:34.168549635 +0000 UTC m=+0.085324837 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.243 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.245 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.287 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.288 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5651MB free_disk=72.96321487426758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.289 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:34 compute-0 nova_compute[187212]: 2025-11-25 19:27:34.289 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:35 compute-0 nova_compute[187212]: 2025-11-25 19:27:35.863 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:27:35 compute-0 nova_compute[187212]: 2025-11-25 19:27:35.864 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:27:35 compute-0 nova_compute[187212]: 2025-11-25 19:27:35.864 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:27:34 up  1:20,  0 user,  load average: 0.32, 0.33, 0.41\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:27:35 compute-0 nova_compute[187212]: 2025-11-25 19:27:35.966 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:27:36 compute-0 nova_compute[187212]: 2025-11-25 19:27:36.482 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:27:36 compute-0 nova_compute[187212]: 2025-11-25 19:27:36.772 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:37 compute-0 nova_compute[187212]: 2025-11-25 19:27:37.000 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:27:37 compute-0 nova_compute[187212]: 2025-11-25 19:27:37.000 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.711s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.001 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.001 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.513 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.513 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.549 187216 DEBUG oslo_concurrency.processutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.642 187216 DEBUG oslo_concurrency.processutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.644 187216 DEBUG oslo_concurrency.processutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.738 187216 DEBUG oslo_concurrency.processutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.741 187216 DEBUG nova.compute.manager [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Preparing to wait for external event network-vif-plugged-bfcb9d3b-425e-4d5f-b3bf-25ff4655b093 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.741 187216 DEBUG oslo_concurrency.lockutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.742 187216 DEBUG oslo_concurrency.lockutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:27:38 compute-0 nova_compute[187212]: 2025-11-25 19:27:38.742 187216 DEBUG oslo_concurrency.lockutils [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:27:39 compute-0 nova_compute[187212]: 2025-11-25 19:27:39.068 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:41 compute-0 nova_compute[187212]: 2025-11-25 19:27:41.775 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:43 compute-0 nova_compute[187212]: 2025-11-25 19:27:43.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:27:44 compute-0 nova_compute[187212]: 2025-11-25 19:27:44.070 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:45 compute-0 ovn_controller[95465]: 2025-11-25T19:27:45Z|00165|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 25 19:27:46 compute-0 nova_compute[187212]: 2025-11-25 19:27:46.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:47 compute-0 podman[216774]: 2025-11-25 19:27:47.156679161 +0000 UTC m=+0.079717770 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:27:49 compute-0 nova_compute[187212]: 2025-11-25 19:27:49.115 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:51 compute-0 nova_compute[187212]: 2025-11-25 19:27:51.780 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:54 compute-0 nova_compute[187212]: 2025-11-25 19:27:54.116 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:55 compute-0 podman[216799]: 2025-11-25 19:27:55.186766011 +0000 UTC m=+0.104243273 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:27:56 compute-0 nova_compute[187212]: 2025-11-25 19:27:56.783 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:58 compute-0 podman[216825]: 2025-11-25 19:27:58.136878152 +0000 UTC m=+0.063374922 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.4)
Nov 25 19:27:59 compute-0 nova_compute[187212]: 2025-11-25 19:27:59.153 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:27:59 compute-0 podman[197585]: time="2025-11-25T19:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:27:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:27:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3080 "" "Go-http-client/1.1"
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: ERROR   19:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: ERROR   19:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: ERROR   19:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: ERROR   19:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: ERROR   19:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:28:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:28:01 compute-0 nova_compute[187212]: 2025-11-25 19:28:01.787 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:03 compute-0 podman[216846]: 2025-11-25 19:28:03.159994021 +0000 UTC m=+0.077790560 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Nov 25 19:28:04 compute-0 nova_compute[187212]: 2025-11-25 19:28:04.154 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:05 compute-0 podman[216867]: 2025-11-25 19:28:05.1697344 +0000 UTC m=+0.087173266 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4)
Nov 25 19:28:06 compute-0 nova_compute[187212]: 2025-11-25 19:28:06.791 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:09 compute-0 nova_compute[187212]: 2025-11-25 19:28:09.189 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:11 compute-0 nova_compute[187212]: 2025-11-25 19:28:11.812 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:14 compute-0 nova_compute[187212]: 2025-11-25 19:28:14.190 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:14 compute-0 sshd-session[216888]: Invalid user admin from 209.38.103.174 port 41896
Nov 25 19:28:14 compute-0 sshd-session[216888]: Connection closed by invalid user admin 209.38.103.174 port 41896 [preauth]
Nov 25 19:28:16 compute-0 nova_compute[187212]: 2025-11-25 19:28:16.815 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:18 compute-0 podman[216893]: 2025-11-25 19:28:18.148156009 +0000 UTC m=+0.065745664 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:28:19 compute-0 nova_compute[187212]: 2025-11-25 19:28:19.193 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:21 compute-0 nova_compute[187212]: 2025-11-25 19:28:21.817 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:24 compute-0 nova_compute[187212]: 2025-11-25 19:28:24.195 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:26 compute-0 podman[216918]: 2025-11-25 19:28:26.208836142 +0000 UTC m=+0.133621213 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 25 19:28:26 compute-0 nova_compute[187212]: 2025-11-25 19:28:26.819 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:27 compute-0 nova_compute[187212]: 2025-11-25 19:28:27.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:27 compute-0 nova_compute[187212]: 2025-11-25 19:28:27.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:28:29 compute-0 podman[216944]: 2025-11-25 19:28:29.143457499 +0000 UTC m=+0.072350607 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:28:29 compute-0 nova_compute[187212]: 2025-11-25 19:28:29.218 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:29 compute-0 podman[197585]: time="2025-11-25T19:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:28:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:28:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3087 "" "Go-http-client/1.1"
Nov 25 19:28:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:28:31.110 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:28:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:28:31.111 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:28:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:28:31.111 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:28:31 compute-0 nova_compute[187212]: 2025-11-25 19:28:31.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:31 compute-0 nova_compute[187212]: 2025-11-25 19:28:31.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: ERROR   19:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: ERROR   19:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: ERROR   19:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: ERROR   19:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: ERROR   19:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:28:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:28:31 compute-0 nova_compute[187212]: 2025-11-25 19:28:31.851 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:34 compute-0 nova_compute[187212]: 2025-11-25 19:28:34.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:34 compute-0 podman[216964]: 2025-11-25 19:28:34.185804143 +0000 UTC m=+0.102482115 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, version=9.6)
Nov 25 19:28:34 compute-0 nova_compute[187212]: 2025-11-25 19:28:34.220 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:36 compute-0 podman[216986]: 2025-11-25 19:28:36.181078958 +0000 UTC m=+0.088879247 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:28:36 compute-0 nova_compute[187212]: 2025-11-25 19:28:36.853 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Pre live migration failed at compute-1.ctlplane.example.com: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 251718a316874e9c8552f35e1f0574eb
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Traceback (most recent call last):
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9562, in _do_pre_live_migration_from_source
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     migrate_data = self.compute_rpcapi.pre_live_migration(
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/nova/compute/rpcapi.py", line 949, in pre_live_migration
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     return cctxt.call(ctxt, 'pre_live_migration',
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     result = self.transport._send(
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]              ^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     return self._driver.send(target, ctxt, message,
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 783, in _send
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     result = self._waiter.wait(msg_id, timeout,
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 654, in wait
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     message = self.waiters.get(msg_id, timeout=timeout)
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 520, in get
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952]     raise oslo_messaging.MessagingTimeout(
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 251718a316874e9c8552f35e1f0574eb
Nov 25 19:28:38 compute-0 nova_compute[187212]: 2025-11-25 19:28:38.876 187216 ERROR nova.compute.manager [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] 
Nov 25 19:28:39 compute-0 nova_compute[187212]: 2025-11-25 19:28:39.248 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:41 compute-0 nova_compute[187212]: 2025-11-25 19:28:41.918 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:44 compute-0 nova_compute[187212]: 2025-11-25 19:28:44.251 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:46 compute-0 nova_compute[187212]: 2025-11-25 19:28:46.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:49 compute-0 podman[217008]: 2025-11-25 19:28:49.169937895 +0000 UTC m=+0.083884306 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:28:49 compute-0 nova_compute[187212]: 2025-11-25 19:28:49.253 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:51 compute-0 nova_compute[187212]: 2025-11-25 19:28:51.965 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:52 compute-0 nova_compute[187212]: 2025-11-25 19:28:52.421 187216 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : 251718a316874e9c8552f35e1f0574eb
Nov 25 19:28:52 compute-0 nova_compute[187212]: 2025-11-25 19:28:52.921 187216 WARNING oslo.service.backend._eventlet.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 49.16 sec
Nov 25 19:28:52 compute-0 nova_compute[187212]: 2025-11-25 19:28:52.924 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:28:52 compute-0 nova_compute[187212]: 2025-11-25 19:28:52.925 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:28:52 compute-0 nova_compute[187212]: 2025-11-25 19:28:52.925 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:28:52 compute-0 nova_compute[187212]: 2025-11-25 19:28:52.926 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:28:53 compute-0 nova_compute[187212]: 2025-11-25 19:28:53.730 187216 WARNING nova.scheduler.client.report [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Failed to retrieve allocations for consumer 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362: {"errors": [{"status": 500, "title": "Internal Server Error", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n (pymysql.err.OperationalError) (2003, \"Can't connect to MySQL server on 'openstack.openstack.svc' ([Errno 111] Connection refused)\") (Background on this error at: https://sqlalche.me/e/20/e3q8)  ", "code": "placement.undefined_code", "request_id": "req-c4ab1485-ed63-4170-80ad-cf3cadb94d2e"}]}: nova.exception.ConsumerAllocationRetrievalFailed: Failed to retrieve allocations for consumer 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362: {"errors": [{"status": 500, "title": "Internal Server Error", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n (pymysql.err.OperationalError) (2003, \"Can't connect to MySQL server on 'openstack.openstack.svc' ([Errno 111] Connection refused)\") (Background on this error at: https://sqlalche.me/e/20/e3q8)  ", "code": "placement.undefined_code", "request_id": "req-c4ab1485-ed63-4170-80ad-cf3cadb94d2e"}]}
Nov 25 19:28:53 compute-0 nova_compute[187212]: 2025-11-25 19:28:53.731 187216 ERROR nova.compute.manager [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Did not find resource allocations for migration 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 on source node compute-0.ctlplane.example.com. Unable to revert source node allocations back to the instance.: oslo_messaging.exceptions.MessagingTimeout: Timed out waiting for a reply to message ID 251718a316874e9c8552f35e1f0574eb
Nov 25 19:28:53 compute-0 nova_compute[187212]: 2025-11-25 19:28:53.857 187216 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : 251718a316874e9c8552f35e1f0574eb
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.039 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.112 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.114 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.201 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.254 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.274 187216 DEBUG nova.objects.instance [None req-6067b781-a824-479f-850e-82adbf5b6e49 96430740f673482099709c323a75c916 105681f01c6e4422bb6b864118579069 - - default default] Lazy-loading 'migration_context' on Instance uuid f71d9429-2da3-4b6b-b82d-63027e46f952 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.430 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.431 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.456 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.457 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=72.96318435668945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.458 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:28:54 compute-0 nova_compute[187212]: 2025-11-25 19:28:54.459 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:28:55 compute-0 nova_compute[187212]: 2025-11-25 19:28:55.506 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] [instance: f71d9429-2da3-4b6b-b82d-63027e46f952] Updating resource usage from migration 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362
Nov 25 19:28:55 compute-0 nova_compute[187212]: 2025-11-25 19:28:55.745 187216 ERROR nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12: {"errors": [{"status": 500, "title": "Internal Server Error", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n (pymysql.err.OperationalError) (2003, \"Can't connect to MySQL server on 'openstack.openstack.svc' ([Errno 111] Connection refused)\") (Background on this error at: https://sqlalche.me/e/20/e3q8)  ", "request_id": "req-e3e07b00-5de1-4fa9-8fdf-3d96aca582c4"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12: {"errors": [{"status": 500, "title": "Internal Server Error", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n (pymysql.err.OperationalError) (2003, \"Can't connect to MySQL server on 'openstack.openstack.svc' ([Errno 111] Connection refused)\") (Background on this error at: https://sqlalche.me/e/20/e3q8)  ", "request_id": "req-e3e07b00-5de1-4fa9-8fdf-3d96aca582c4"}]}
Nov 25 19:28:55 compute-0 nova_compute[187212]: 2025-11-25 19:28:55.746 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:28:55 compute-0 nova_compute[187212]: 2025-11-25 19:28:55.746 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:28:54 up  1:21,  0 user,  load average: 0.11, 0.27, 0.38\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.101 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.643s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Error updating resources for node compute-0.ctlplane.example.com.: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12: {"errors": [{"status": 500, "title": "Internal Server Error", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n (pymysql.err.OperationalError) (2003, \"Can't connect to MySQL server on 'openstack.openstack.svc' ([Errno 111] Connection refused)\") (Background on this error at: https://sqlalche.me/e/20/e3q8)  ", "request_id": "req-5639281d-198a-4d7a-8498-c31479db08ab"}]}
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager Traceback (most recent call last):
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 11268, in _update_available_resource_for_node
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     self.rt.update_available_resource(context, nodename,
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py", line 965, in update_available_resource
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     self._update_available_resource(context, resources, startup=startup)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py", line 415, in inner
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     return f(*args, **kwargs)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager            ^^^^^^^^^^^^^^^^^^
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py", line 1096, in _update_available_resource
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     self._update(context, cn, startup=startup)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py", line 1408, in _update
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     self._update_to_placement(context, compute_node, startup)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/retrying.py", line 49, in wrapped_f
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     return Retrying(*dargs, **dkw).call(f, *args, **kw)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/retrying.py", line 206, in call
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     return attempt.get(self._wrap_exception)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/retrying.py", line 247, in get
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     six.reraise(self.value[0], self.value[1], self.value[2])
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/six.py", line 719, in reraise
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     raise value
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/retrying.py", line 200, in call
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager                       ^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py", line 1317, in _update_to_placement
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     allocs = self.reportclient.get_allocations_for_provider_tree(
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py", line 2313, in get_allocations_for_provider_tree
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     alloc_info = self.get_allocations_for_resource_provider(context, u)
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager   File "/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py", line 2241, in get_allocations_for_resource_provider
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager     raise exception.ResourceProviderAllocationRetrievalFailed(
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12: {"errors": [{"status": 500, "title": "Internal Server Error", "detail": "The server has either erred or is incapable of performing the requested operation.\n\n (pymysql.err.OperationalError) (2003, \"Can't connect to MySQL server on 'openstack.openstack.svc' ([Errno 111] Connection refused)\") (Background on this error at: https://sqlalche.me/e/20/e3q8)  ", "request_id": "req-5639281d-198a-4d7a-8498-c31479db08ab"}]}
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.102 187216 ERROR nova.compute.manager 
Nov 25 19:28:56 compute-0 nova_compute[187212]: 2025-11-25 19:28:56.969 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:57 compute-0 nova_compute[187212]: 2025-11-25 19:28:57.105 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:57 compute-0 nova_compute[187212]: 2025-11-25 19:28:57.106 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:57 compute-0 nova_compute[187212]: 2025-11-25 19:28:57.106 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:57 compute-0 nova_compute[187212]: 2025-11-25 19:28:57.107 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:28:57 compute-0 podman[217042]: 2025-11-25 19:28:57.218034554 +0000 UTC m=+0.133244933 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Nov 25 19:28:59 compute-0 nova_compute[187212]: 2025-11-25 19:28:59.309 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:28:59 compute-0 podman[197585]: time="2025-11-25T19:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:28:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:28:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3085 "" "Go-http-client/1.1"
Nov 25 19:29:00 compute-0 podman[217072]: 2025-11-25 19:29:00.169238715 +0000 UTC m=+0.089681248 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: ERROR   19:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: ERROR   19:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: ERROR   19:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: ERROR   19:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: ERROR   19:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:29:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:29:02 compute-0 nova_compute[187212]: 2025-11-25 19:29:02.009 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:04 compute-0 nova_compute[187212]: 2025-11-25 19:29:04.310 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:04 compute-0 podman[217093]: 2025-11-25 19:29:04.449644405 +0000 UTC m=+0.071390638 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, 
managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 25 19:29:07 compute-0 nova_compute[187212]: 2025-11-25 19:29:07.012 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:07 compute-0 podman[217118]: 2025-11-25 19:29:07.170251565 +0000 UTC m=+0.091630589 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 19:29:09 compute-0 sshd-session[217138]: Invalid user admin from 209.38.103.174 port 56632
Nov 25 19:29:09 compute-0 nova_compute[187212]: 2025-11-25 19:29:09.314 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:09 compute-0 sshd-session[217138]: Connection closed by invalid user admin 209.38.103.174 port 56632 [preauth]
Nov 25 19:29:12 compute-0 nova_compute[187212]: 2025-11-25 19:29:12.015 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:14 compute-0 nova_compute[187212]: 2025-11-25 19:29:14.316 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:17 compute-0 nova_compute[187212]: 2025-11-25 19:29:17.018 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:19 compute-0 nova_compute[187212]: 2025-11-25 19:29:19.319 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:20 compute-0 podman[217140]: 2025-11-25 19:29:20.193317342 +0000 UTC m=+0.112322704 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:29:22 compute-0 nova_compute[187212]: 2025-11-25 19:29:22.062 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:24 compute-0 nova_compute[187212]: 2025-11-25 19:29:24.372 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:27 compute-0 nova_compute[187212]: 2025-11-25 19:29:27.067 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:28 compute-0 nova_compute[187212]: 2025-11-25 19:29:28.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:28 compute-0 nova_compute[187212]: 2025-11-25 19:29:28.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:29:28 compute-0 podman[217163]: 2025-11-25 19:29:28.237320934 +0000 UTC m=+0.157761567 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:29:28 compute-0 nova_compute[187212]: 2025-11-25 19:29:28.419 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:29 compute-0 nova_compute[187212]: 2025-11-25 19:29:29.373 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:29 compute-0 podman[197585]: time="2025-11-25T19:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:29:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:29:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3085 "" "Go-http-client/1.1"
Nov 25 19:29:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:31.112 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:29:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:31.113 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:29:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:31.114 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:29:31 compute-0 podman[217190]: 2025-11-25 19:29:31.164677679 +0000 UTC m=+0.074341835 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: ERROR   19:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: ERROR   19:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: ERROR   19:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: ERROR   19:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: ERROR   19:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:29:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:29:32 compute-0 nova_compute[187212]: 2025-11-25 19:29:32.091 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:33 compute-0 nova_compute[187212]: 2025-11-25 19:29:33.684 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:33 compute-0 nova_compute[187212]: 2025-11-25 19:29:33.686 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:34 compute-0 nova_compute[187212]: 2025-11-25 19:29:34.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:34 compute-0 nova_compute[187212]: 2025-11-25 19:29:34.416 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:34 compute-0 nova_compute[187212]: 2025-11-25 19:29:34.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:29:34 compute-0 nova_compute[187212]: 2025-11-25 19:29:34.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:29:34 compute-0 nova_compute[187212]: 2025-11-25 19:29:34.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:29:34 compute-0 nova_compute[187212]: 2025-11-25 19:29:34.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:29:35 compute-0 podman[217211]: 2025-11-25 19:29:35.16785619 +0000 UTC m=+0.084552293 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:29:35 compute-0 nova_compute[187212]: 2025-11-25 19:29:35.737 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:29:35 compute-0 nova_compute[187212]: 2025-11-25 19:29:35.797 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:29:35 compute-0 nova_compute[187212]: 2025-11-25 19:29:35.799 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:29:35 compute-0 nova_compute[187212]: 2025-11-25 19:29:35.855 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:29:36 compute-0 nova_compute[187212]: 2025-11-25 19:29:36.095 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:29:36 compute-0 nova_compute[187212]: 2025-11-25 19:29:36.097 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:29:36 compute-0 nova_compute[187212]: 2025-11-25 19:29:36.121 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:29:36 compute-0 nova_compute[187212]: 2025-11-25 19:29:36.122 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=72.96318435668945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:29:36 compute-0 nova_compute[187212]: 2025-11-25 19:29:36.123 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:29:36 compute-0 nova_compute[187212]: 2025-11-25 19:29:36.123 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:29:37 compute-0 nova_compute[187212]: 2025-11-25 19:29:37.095 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:37 compute-0 nova_compute[187212]: 2025-11-25 19:29:37.739 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:29:37 compute-0 nova_compute[187212]: 2025-11-25 19:29:37.740 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:29:37 compute-0 nova_compute[187212]: 2025-11-25 19:29:37.740 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:29:36 up  1:22,  0 user,  load average: 0.17, 0.26, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:29:37 compute-0 nova_compute[187212]: 2025-11-25 19:29:37.849 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:29:38 compute-0 podman[217241]: 2025-11-25 19:29:38.14413018 +0000 UTC m=+0.063259404 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:29:38 compute-0 nova_compute[187212]: 2025-11-25 19:29:38.357 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:29:38 compute-0 nova_compute[187212]: 2025-11-25 19:29:38.868 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:29:38 compute-0 nova_compute[187212]: 2025-11-25 19:29:38.868 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.745s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:29:39 compute-0 nova_compute[187212]: 2025-11-25 19:29:39.464 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:39 compute-0 nova_compute[187212]: 2025-11-25 19:29:39.864 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:39 compute-0 nova_compute[187212]: 2025-11-25 19:29:39.865 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:40 compute-0 nova_compute[187212]: 2025-11-25 19:29:40.377 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:40 compute-0 nova_compute[187212]: 2025-11-25 19:29:40.378 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:42 compute-0 nova_compute[187212]: 2025-11-25 19:29:42.144 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:42 compute-0 nova_compute[187212]: 2025-11-25 19:29:42.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:42 compute-0 nova_compute[187212]: 2025-11-25 19:29:42.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:29:42 compute-0 nova_compute[187212]: 2025-11-25 19:29:42.681 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:29:44 compute-0 nova_compute[187212]: 2025-11-25 19:29:44.499 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:45 compute-0 nova_compute[187212]: 2025-11-25 19:29:45.413 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:45 compute-0 nova_compute[187212]: 2025-11-25 19:29:45.413 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:45 compute-0 nova_compute[187212]: 2025-11-25 19:29:45.924 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Triggering sync for uuid f71d9429-2da3-4b6b-b82d-63027e46f952 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Nov 25 19:29:45 compute-0 nova_compute[187212]: 2025-11-25 19:29:45.925 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:29:45 compute-0 nova_compute[187212]: 2025-11-25 19:29:45.926 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:29:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:46.082 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:29:46 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:46.083 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:29:46 compute-0 nova_compute[187212]: 2025-11-25 19:29:46.083 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:46 compute-0 nova_compute[187212]: 2025-11-25 19:29:46.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:46 compute-0 nova_compute[187212]: 2025-11-25 19:29:46.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:29:46 compute-0 nova_compute[187212]: 2025-11-25 19:29:46.442 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.516s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:29:47 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:47.085 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:29:47 compute-0 nova_compute[187212]: 2025-11-25 19:29:47.148 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:47 compute-0 nova_compute[187212]: 2025-11-25 19:29:47.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:29:48 compute-0 ovn_controller[95465]: 2025-11-25T19:29:48Z|00166|binding|INFO|Releasing lport f3db0a73-6d5e-44f5-a754-565ad86befff from this chassis (sb_readonly=0)
Nov 25 19:29:48 compute-0 nova_compute[187212]: 2025-11-25 19:29:48.664 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:49 compute-0 nova_compute[187212]: 2025-11-25 19:29:49.501 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:51 compute-0 podman[217266]: 2025-11-25 19:29:51.155768483 +0000 UTC m=+0.078268707 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:29:52 compute-0 nova_compute[187212]: 2025-11-25 19:29:52.151 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:54 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:54.048 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:f6:3a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9cd5f83030a746feb58b69fd4437cb54', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9e09c55-8331-4266-b41a-8ad7cac362a3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=62bed0d5-309f-407f-9857-408a2f143a2b) old=Port_Binding(mac=['fa:16:3e:81:f6:3a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9cd5f83030a746feb58b69fd4437cb54', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:29:54 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:54.049 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 62bed0d5-309f-407f-9857-408a2f143a2b in datapath 0eeee5bd-f568-4881-a684-2e2dd854c2e8 updated
Nov 25 19:29:54 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:54.051 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0eeee5bd-f568-4881-a684-2e2dd854c2e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:29:54 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:29:54.053 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[50135a39-44c5-4d2d-833e-63032dc92ec1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:29:54 compute-0 nova_compute[187212]: 2025-11-25 19:29:54.563 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:57 compute-0 nova_compute[187212]: 2025-11-25 19:29:57.152 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:59 compute-0 podman[217292]: 2025-11-25 19:29:59.21667623 +0000 UTC m=+0.138404429 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 25 19:29:59 compute-0 nova_compute[187212]: 2025-11-25 19:29:59.565 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:29:59 compute-0 podman[197585]: time="2025-11-25T19:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:29:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:29:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3086 "" "Go-http-client/1.1"
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: ERROR   19:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: ERROR   19:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: ERROR   19:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: ERROR   19:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: ERROR   19:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:30:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:30:02 compute-0 nova_compute[187212]: 2025-11-25 19:30:02.155 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:02 compute-0 podman[217318]: 2025-11-25 19:30:02.17583669 +0000 UTC m=+0.088041455 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:30:03 compute-0 sshd-session[217337]: Invalid user admin from 209.38.103.174 port 51656
Nov 25 19:30:03 compute-0 sshd-session[217337]: Connection closed by invalid user admin 209.38.103.174 port 51656 [preauth]
Nov 25 19:30:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:04.089 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:cf:cc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-baadc113-b6e9-40d1-ba83-c7c7fd974bf0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baadc113-b6e9-40d1-ba83-c7c7fd974bf0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3407615aeb074089a7b15fbc9f4e9578', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d8a79958-6f58-438b-8283-f99105974973, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f703f486-a53f-4105-bb4f-8058e04ac15d) old=Port_Binding(mac=['fa:16:3e:7b:cf:cc'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-baadc113-b6e9-40d1-ba83-c7c7fd974bf0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-baadc113-b6e9-40d1-ba83-c7c7fd974bf0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3407615aeb074089a7b15fbc9f4e9578', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:30:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:04.091 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f703f486-a53f-4105-bb4f-8058e04ac15d in datapath baadc113-b6e9-40d1-ba83-c7c7fd974bf0 updated
Nov 25 19:30:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:04.093 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network baadc113-b6e9-40d1-ba83-c7c7fd974bf0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:30:04 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:04.094 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c58edcb0-d505-4473-8d19-b01b42f3f3aa]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:04 compute-0 nova_compute[187212]: 2025-11-25 19:30:04.569 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:06 compute-0 podman[217339]: 2025-11-25 19:30:06.202237471 +0000 UTC m=+0.103916912 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Nov 25 19:30:07 compute-0 nova_compute[187212]: 2025-11-25 19:30:07.159 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:09 compute-0 podman[217360]: 2025-11-25 19:30:09.162967473 +0000 UTC m=+0.082347486 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:30:09 compute-0 nova_compute[187212]: 2025-11-25 19:30:09.608 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:12 compute-0 nova_compute[187212]: 2025-11-25 19:30:12.162 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:14 compute-0 nova_compute[187212]: 2025-11-25 19:30:14.610 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:17 compute-0 nova_compute[187212]: 2025-11-25 19:30:17.165 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:19 compute-0 nova_compute[187212]: 2025-11-25 19:30:19.613 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:22 compute-0 podman[217380]: 2025-11-25 19:30:22.149405907 +0000 UTC m=+0.064634910 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:30:22 compute-0 nova_compute[187212]: 2025-11-25 19:30:22.168 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:24 compute-0 nova_compute[187212]: 2025-11-25 19:30:24.644 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:27 compute-0 nova_compute[187212]: 2025-11-25 19:30:27.180 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:29 compute-0 nova_compute[187212]: 2025-11-25 19:30:29.647 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:29 compute-0 nova_compute[187212]: 2025-11-25 19:30:29.678 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:29 compute-0 nova_compute[187212]: 2025-11-25 19:30:29.679 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:30:29 compute-0 podman[197585]: time="2025-11-25T19:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:30:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:30:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3088 "" "Go-http-client/1.1"
Nov 25 19:30:30 compute-0 podman[217405]: 2025-11-25 19:30:30.212406688 +0000 UTC m=+0.135641947 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 25 19:30:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:31.115 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:31.115 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:31.116 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: ERROR   19:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: ERROR   19:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: ERROR   19:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: ERROR   19:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: ERROR   19:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:30:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:30:32 compute-0 nova_compute[187212]: 2025-11-25 19:30:32.232 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:33 compute-0 podman[217434]: 2025-11-25 19:30:33.159922742 +0000 UTC m=+0.079571042 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 25 19:30:34 compute-0 nova_compute[187212]: 2025-11-25 19:30:34.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:34 compute-0 nova_compute[187212]: 2025-11-25 19:30:34.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:34 compute-0 nova_compute[187212]: 2025-11-25 19:30:34.694 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.237 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.238 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:30:36 compute-0 nova_compute[187212]: 2025-11-25 19:30:36.854 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Nov 25 19:30:37 compute-0 podman[217451]: 2025-11-25 19:30:37.200312141 +0000 UTC m=+0.119294927 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.235 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.641 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.642 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.652 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.653 187216 INFO nova.compute.claims [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Claim successful on node compute-0.ctlplane.example.com
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.778 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.865 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.866 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:37 compute-0 nova_compute[187212]: 2025-11-25 19:30:37.953 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.207 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.210 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.250 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.251 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5666MB free_disk=72.96318435668945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.251 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.823 187216 DEBUG nova.scheduler.client.report [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.873 187216 DEBUG nova.scheduler.client.report [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.873 187216 DEBUG nova.compute.provider_tree [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.894 187216 DEBUG nova.scheduler.client.report [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.929 187216 DEBUG nova.scheduler.client.report [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:30:38 compute-0 nova_compute[187212]: 2025-11-25 19:30:38.990 187216 DEBUG nova.compute.provider_tree [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:30:39 compute-0 nova_compute[187212]: 2025-11-25 19:30:39.579 187216 DEBUG nova.scheduler.client.report [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:30:39 compute-0 nova_compute[187212]: 2025-11-25 19:30:39.697 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.141 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.499s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.143 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.146 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 1.895s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:40 compute-0 podman[217480]: 2025-11-25 19:30:40.187835765 +0000 UTC m=+0.102471604 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.688 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.689 187216 DEBUG nova.network.neutron [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.689 187216 WARNING neutronclient.v2_0.client [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:30:40 compute-0 nova_compute[187212]: 2025-11-25 19:30:40.690 187216 WARNING neutronclient.v2_0.client [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.227 187216 INFO nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.738 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.835 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.836 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.837 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.837 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:30:38 up  1:23,  0 user,  load average: 0.10, 0.23, 0.35\n', 'num_instances': '2', 'num_vm_building': '1', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '1', 'num_vm_active': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:30:41 compute-0 nova_compute[187212]: 2025-11-25 19:30:41.913 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.239 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.421 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.615 187216 DEBUG nova.network.neutron [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Successfully created port: 55822546-3c29-431d-b662-c78aac24c194 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.821 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.822 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.823 187216 INFO nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Creating image(s)
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.824 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "/var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.824 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "/var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.825 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "/var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.825 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.829 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.832 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.925 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.926 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.927 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.927 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.930 187216 DEBUG oslo_utils.imageutils.format_inspector [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.931 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.940 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.941 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.795s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:42 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.998 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:42.999 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.044 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730,backing_fmt=raw /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.045 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "1c0eb12bfe5dbef092d49128b5539724adaa8730" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.046 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.106 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.107 187216 DEBUG nova.virt.disk.api [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Checking if we can resize image /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.108 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.181 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.182 187216 DEBUG nova.virt.disk.api [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Cannot resize image /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.183 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.183 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Ensure instance console log exists: /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.183 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.184 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.184 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.278 187216 DEBUG nova.network.neutron [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Successfully updated port: 55822546-3c29-431d-b662-c78aac24c194 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.363 187216 DEBUG nova.compute.manager [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-changed-55822546-3c29-431d-b662-c78aac24c194 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.364 187216 DEBUG nova.compute.manager [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Refreshing instance network info cache due to event network-changed-55822546-3c29-431d-b662-c78aac24c194. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.364 187216 DEBUG oslo_concurrency.lockutils [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "refresh_cache-7b272b07-af4e-48a9-982b-25888fa2f334" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.365 187216 DEBUG oslo_concurrency.lockutils [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquired lock "refresh_cache-7b272b07-af4e-48a9-982b-25888fa2f334" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.365 187216 DEBUG nova.network.neutron [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Refreshing network info cache for port 55822546-3c29-431d-b662-c78aac24c194 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.786 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "refresh_cache-7b272b07-af4e-48a9-982b-25888fa2f334" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.877 187216 WARNING neutronclient.v2_0.client [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.942 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.943 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:43 compute-0 nova_compute[187212]: 2025-11-25 19:30:43.946 187216 DEBUG nova.network.neutron [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:30:44 compute-0 nova_compute[187212]: 2025-11-25 19:30:44.701 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:44 compute-0 nova_compute[187212]: 2025-11-25 19:30:44.885 187216 DEBUG nova.network.neutron [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:30:45 compute-0 nova_compute[187212]: 2025-11-25 19:30:45.393 187216 DEBUG oslo_concurrency.lockutils [req-9f3f07c5-f6f1-4fba-b8d1-c76af4d31163 req-c572c0f0-3cbf-4a39-905c-bc71fb8295c1 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Releasing lock "refresh_cache-7b272b07-af4e-48a9-982b-25888fa2f334" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:30:45 compute-0 nova_compute[187212]: 2025-11-25 19:30:45.395 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquired lock "refresh_cache-7b272b07-af4e-48a9-982b-25888fa2f334" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Nov 25 19:30:45 compute-0 nova_compute[187212]: 2025-11-25 19:30:45.395 187216 DEBUG nova.network.neutron [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Nov 25 19:30:46 compute-0 nova_compute[187212]: 2025-11-25 19:30:46.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:30:46 compute-0 nova_compute[187212]: 2025-11-25 19:30:46.754 187216 DEBUG nova.network.neutron [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Nov 25 19:30:47 compute-0 nova_compute[187212]: 2025-11-25 19:30:47.241 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:47 compute-0 nova_compute[187212]: 2025-11-25 19:30:47.842 187216 WARNING neutronclient.v2_0.client [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:30:48 compute-0 nova_compute[187212]: 2025-11-25 19:30:48.739 187216 DEBUG nova.network.neutron [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Updating instance_info_cache with network_info: [{"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.246 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Releasing lock "refresh_cache-7b272b07-af4e-48a9-982b-25888fa2f334" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.247 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Instance network_info: |[{"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.251 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Start _get_guest_xml network_info=[{"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': '5ca774a8-6150-424f-aaca-03ab3a3ee8cf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.257 187216 WARNING nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.260 187216 DEBUG nova.virt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2100065439', uuid='7b272b07-af4e-48a9-982b-25888fa2f334'), owner=OwnerMeta(userid='7e1e9cf32ad84b49a76e6a2fc6fe1c70', username='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165-project-admin', projectid='3407615aeb074089a7b15fbc9f4e9578', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165'), image=ImageMeta(id='5ca774a8-6150-424f-aaca-03ab3a3ee8cf', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": 
"55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764099049.2602727) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.265 187216 DEBUG nova.virt.libvirt.host [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.266 187216 DEBUG nova.virt.libvirt.host [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.269 187216 DEBUG nova.virt.libvirt.host [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.270 187216 DEBUG nova.virt.libvirt.host [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.272 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.272 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-25T19:04:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='d7d5bae9-10ca-4750-9d69-ce73a869da56',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-25T19:04:42Z,direct_url=<?>,disk_format='qcow2',id=5ca774a8-6150-424f-aaca-03ab3a3ee8cf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8452218b0aa04a20a3969d637355f8c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-25T19:04:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.273 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.273 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.273 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.273 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.274 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.274 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.275 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.275 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.275 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.275 187216 DEBUG nova.virt.hardware [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.280 187216 DEBUG nova.virt.libvirt.vif [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:30:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2100065439',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2100065439',id=21,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3407615aeb074089a7b15fbc9f4e9578',ramdisk_id='',reservation_id='r-z7sedpdx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165',owner_user_name='tem
pest-TestExecuteVmWorkloadBalanceStrategy-1869045165-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:30:41Z,user_data=None,user_id='7e1e9cf32ad84b49a76e6a2fc6fe1c70',uuid=7b272b07-af4e-48a9-982b-25888fa2f334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.280 187216 DEBUG nova.network.os_vif_util [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Converting VIF {"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.281 187216 DEBUG nova.network.os_vif_util [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.282 187216 DEBUG nova.objects.instance [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b272b07-af4e-48a9-982b-25888fa2f334 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.746 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.791 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] End _get_guest_xml xml=<domain type="kvm">
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <uuid>7b272b07-af4e-48a9-982b-25888fa2f334</uuid>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <name>instance-00000015</name>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <memory>131072</memory>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <vcpu>1</vcpu>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <metadata>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-2100065439</nova:name>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:creationTime>2025-11-25 19:30:49</nova:creationTime>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:flavor name="m1.nano" id="d7d5bae9-10ca-4750-9d69-ce73a869da56">
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:memory>128</nova:memory>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:disk>1</nova:disk>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:swap>0</nova:swap>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:ephemeral>0</nova:ephemeral>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:vcpus>1</nova:vcpus>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:extraSpecs>
Nov 25 19:30:49 compute-0 nova_compute[187212]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         </nova:extraSpecs>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       </nova:flavor>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:image uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf">
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:containerFormat>bare</nova:containerFormat>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:diskFormat>qcow2</nova:diskFormat>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:minDisk>1</nova:minDisk>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:minRam>0</nova:minRam>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:properties>
Nov 25 19:30:49 compute-0 nova_compute[187212]:           <nova:property name="hw_rng_model">virtio</nova:property>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         </nova:properties>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       </nova:image>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:owner>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:user uuid="7e1e9cf32ad84b49a76e6a2fc6fe1c70">tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165-project-admin</nova:user>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:project uuid="3407615aeb074089a7b15fbc9f4e9578">tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165</nova:project>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       </nova:owner>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:root type="image" uuid="5ca774a8-6150-424f-aaca-03ab3a3ee8cf"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <nova:ports>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         <nova:port uuid="55822546-3c29-431d-b662-c78aac24c194">
Nov 25 19:30:49 compute-0 nova_compute[187212]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:         </nova:port>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       </nova:ports>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </nova:instance>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </metadata>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <sysinfo type="smbios">
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <system>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <entry name="manufacturer">RDO</entry>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <entry name="product">OpenStack Compute</entry>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <entry name="serial">7b272b07-af4e-48a9-982b-25888fa2f334</entry>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <entry name="uuid">7b272b07-af4e-48a9-982b-25888fa2f334</entry>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <entry name="family">Virtual Machine</entry>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </system>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </sysinfo>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <os>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <boot dev="hd"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <smbios mode="sysinfo"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </os>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <features>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <acpi/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <apic/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <vmcoreinfo/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </features>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <clock offset="utc">
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <timer name="pit" tickpolicy="delay"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <timer name="hpet" present="no"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </clock>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <cpu mode="custom" match="exact">
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <model>Nehalem</model>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <topology sockets="1" cores="1" threads="1"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </cpu>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   <devices>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <disk type="file" device="disk">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <target dev="vda" bus="virtio"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <disk type="file" device="cdrom">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <driver name="qemu" type="raw" cache="none"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <source file="/var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.config"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <target dev="sda" bus="sata"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </disk>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <interface type="ethernet">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <mac address="fa:16:3e:fb:93:6a"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <driver name="vhost" rx_queue_size="512"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <mtu size="1442"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <target dev="tap55822546-3c"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </interface>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <serial type="pty">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <log file="/var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/console.log" append="off"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </serial>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <video>
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <model type="virtio"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </video>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <input type="tablet" bus="usb"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <rng model="virtio">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <backend model="random">/dev/urandom</backend>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </rng>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="pci" model="pcie-root-port"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <controller type="usb" index="0"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Nov 25 19:30:49 compute-0 nova_compute[187212]:       <stats period="10"/>
Nov 25 19:30:49 compute-0 nova_compute[187212]:     </memballoon>
Nov 25 19:30:49 compute-0 nova_compute[187212]:   </devices>
Nov 25 19:30:49 compute-0 nova_compute[187212]: </domain>
Nov 25 19:30:49 compute-0 nova_compute[187212]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.793 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Preparing to wait for external event network-vif-plugged-55822546-3c29-431d-b662-c78aac24c194 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.794 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.794 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.795 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.796 187216 DEBUG nova.virt.libvirt.vif [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-11-25T19:30:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2100065439',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2100065439',id=21,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3407615aeb074089a7b15fbc9f4e9578',ramdisk_id='',reservation_id='r-z7sedpdx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165',owner_user
_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-25T19:30:41Z,user_data=None,user_id='7e1e9cf32ad84b49a76e6a2fc6fe1c70',uuid=7b272b07-af4e-48a9-982b-25888fa2f334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.797 187216 DEBUG nova.network.os_vif_util [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Converting VIF {"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.798 187216 DEBUG nova.network.os_vif_util [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.799 187216 DEBUG os_vif [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.800 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.801 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.801 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.802 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.803 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '65a3dfc8-4b51-5394-b0c4-c1c8f5892db9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.804 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.806 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.807 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.811 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.811 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55822546-3c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.812 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap55822546-3c, col_values=(('qos', UUID('800c22cc-db78-4f38-bf12-116cfa0f93d2')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.812 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap55822546-3c, col_values=(('external_ids', {'iface-id': '55822546-3c29-431d-b662-c78aac24c194', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:93:6a', 'vm-uuid': '7b272b07-af4e-48a9-982b-25888fa2f334'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.813 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 NetworkManager[55552]: <info>  [1764099049.8147] manager: (tap55822546-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.816 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.825 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:49 compute-0 nova_compute[187212]: 2025-11-25 19:30:49.826 187216 INFO os_vif [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c')
Nov 25 19:30:50 compute-0 ovn_controller[95465]: 2025-11-25T19:30:50Z|00167|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 19:30:51 compute-0 nova_compute[187212]: 2025-11-25 19:30:51.373 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:30:51 compute-0 nova_compute[187212]: 2025-11-25 19:30:51.374 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Nov 25 19:30:51 compute-0 nova_compute[187212]: 2025-11-25 19:30:51.374 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] No VIF found with MAC fa:16:3e:fb:93:6a, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Nov 25 19:30:51 compute-0 nova_compute[187212]: 2025-11-25 19:30:51.375 187216 INFO nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Using config drive
Nov 25 19:30:51 compute-0 nova_compute[187212]: 2025-11-25 19:30:51.888 187216 WARNING neutronclient.v2_0.client [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.336 187216 INFO nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Creating config drive at /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.config
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.345 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpdaqio1cr execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.475 187216 DEBUG oslo_concurrency.processutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpdaqio1cr" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:30:52 compute-0 kernel: tap55822546-3c: entered promiscuous mode
Nov 25 19:30:52 compute-0 NetworkManager[55552]: <info>  [1764099052.5865] manager: (tap55822546-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Nov 25 19:30:52 compute-0 ovn_controller[95465]: 2025-11-25T19:30:52Z|00168|binding|INFO|Claiming lport 55822546-3c29-431d-b662-c78aac24c194 for this chassis.
Nov 25 19:30:52 compute-0 ovn_controller[95465]: 2025-11-25T19:30:52Z|00169|binding|INFO|55822546-3c29-431d-b662-c78aac24c194: Claiming fa:16:3e:fb:93:6a 10.100.0.8
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.594 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.609 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:93:6a 10.100.0.8'], port_security=['fa:16:3e:fb:93:6a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7b272b07-af4e-48a9-982b-25888fa2f334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3407615aeb074089a7b15fbc9f4e9578', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9f3e9396-3e8f-49e3-83aa-c3050a6612b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9e09c55-8331-4266-b41a-8ad7cac362a3, chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=55822546-3c29-431d-b662-c78aac24c194) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.610 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 55822546-3c29-431d-b662-c78aac24c194 in datapath 0eeee5bd-f568-4881-a684-2e2dd854c2e8 bound to our chassis
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.612 104356 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0eeee5bd-f568-4881-a684-2e2dd854c2e8
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.627 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a52abbd1-a9b4-4a76-8e2f-583f8849144f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.628 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0eeee5bd-f1 in ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Nov 25 19:30:52 compute-0 systemd-udevd[217546]: Network interface NamePolicy= disabled on kernel command line.
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.630 208756 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0eeee5bd-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.630 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[0738bd22-8e3d-4f34-be41-084f7293499a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.631 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[987b228a-4b4c-4c77-aa9c-6e2fb4cb4baf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.648 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[1feab48b-9817-4fc9-b993-11e55fad9f5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 NetworkManager[55552]: <info>  [1764099052.6519] device (tap55822546-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 25 19:30:52 compute-0 NetworkManager[55552]: <info>  [1764099052.6532] device (tap55822546-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 25 19:30:52 compute-0 systemd-machined[153494]: New machine qemu-16-instance-00000015.
Nov 25 19:30:52 compute-0 podman[217528]: 2025-11-25 19:30:52.672883427 +0000 UTC m=+0.085264272 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.676 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.676 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[05197176-3661-4839-a803-20af7ed0bfa4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_controller[95465]: 2025-11-25T19:30:52Z|00170|binding|INFO|Setting lport 55822546-3c29-431d-b662-c78aac24c194 ovn-installed in OVS
Nov 25 19:30:52 compute-0 ovn_controller[95465]: 2025-11-25T19:30:52Z|00171|binding|INFO|Setting lport 55822546-3c29-431d-b662-c78aac24c194 up in Southbound
Nov 25 19:30:52 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.684 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.711 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[18a30a3f-b1b6-4a74-9932-b952b7fbc790]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.718 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[9933bc46-16f9-45ce-9a15-23a94dad24cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 NetworkManager[55552]: <info>  [1764099052.7195] manager: (tap0eeee5bd-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.753 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[8a40ef1e-d869-49fe-adfe-d0f861baba84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.756 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[f769ddfc-fdf8-4dfd-b630-eef6146b53c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 NetworkManager[55552]: <info>  [1764099052.7791] device (tap0eeee5bd-f0): carrier: link connected
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.784 210253 DEBUG oslo.privsep.daemon [-] privsep: reply[23399b13-e59e-45dd-89d5-8236ba630de9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.801 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d09366-eba3-48df-99d8-a66134b7afcd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0eeee5bd-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:f6:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500044, 'reachable_time': 16480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217593, 'error': None, 'target': 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.816 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4213c3b9-70c9-4593-9dbb-9b8e9109ff91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:f63a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500044, 'tstamp': 500044}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217594, 'error': None, 'target': 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.829 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[88af9144-57f8-4579-8db2-5dad00281a94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0eeee5bd-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:f6:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500044, 'reachable_time': 16480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217595, 'error': None, 'target': 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.840 187216 DEBUG nova.compute.manager [req-4688f3c6-b71f-4c52-9b53-1f8cdfc9e4a2 req-80006f00-6a32-4779-a61a-a0a0ca9a01a3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-plugged-55822546-3c29-431d-b662-c78aac24c194 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.841 187216 DEBUG oslo_concurrency.lockutils [req-4688f3c6-b71f-4c52-9b53-1f8cdfc9e4a2 req-80006f00-6a32-4779-a61a-a0a0ca9a01a3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.842 187216 DEBUG oslo_concurrency.lockutils [req-4688f3c6-b71f-4c52-9b53-1f8cdfc9e4a2 req-80006f00-6a32-4779-a61a-a0a0ca9a01a3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.842 187216 DEBUG oslo_concurrency.lockutils [req-4688f3c6-b71f-4c52-9b53-1f8cdfc9e4a2 req-80006f00-6a32-4779-a61a-a0a0ca9a01a3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.842 187216 DEBUG nova.compute.manager [req-4688f3c6-b71f-4c52-9b53-1f8cdfc9e4a2 req-80006f00-6a32-4779-a61a-a0a0ca9a01a3 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Processing event network-vif-plugged-55822546-3c29-431d-b662-c78aac24c194 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.863 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[38b33884-ea82-49ba-9bf5-91d6b75c3790]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.921 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.967 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.972 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[a78fe7ac-5b82-49c5-82fa-7aa923d6b4b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.973 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0eeee5bd-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.974 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.974 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0eeee5bd-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.976 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 NetworkManager[55552]: <info>  [1764099052.9771] manager: (tap0eeee5bd-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 25 19:30:52 compute-0 kernel: tap0eeee5bd-f0: entered promiscuous mode
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.979 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0eeee5bd-f0, col_values=(('external_ids', {'iface-id': '62bed0d5-309f-407f-9857-408a2f143a2b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:52 compute-0 ovn_controller[95465]: 2025-11-25T19:30:52Z|00172|binding|INFO|Releasing lport 62bed0d5-309f-407f-9857-408a2f143a2b from this chassis (sb_readonly=0)
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.980 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 nova_compute[187212]: 2025-11-25 19:30:52.995 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.996 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dd7be3-38a8-4476-9914-6e06d9455d82]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.997 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.997 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.997 104356 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0eeee5bd-f568-4881-a684-2e2dd854c2e8 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.998 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.998 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[27fc6b2c-1e01-4183-8da4-2c345784cc42]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.998 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.998 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[17ec9c2f-9758-43e1-bbe8-6de15de3ebc4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:52.999 104356 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: global
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     log         /dev/log local0 debug
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     log-tag     haproxy-metadata-proxy-0eeee5bd-f568-4881-a684-2e2dd854c2e8
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     user        root
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     group       root
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     maxconn     1024
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     pidfile     /var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     daemon
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: defaults
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     log global
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     mode http
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     option httplog
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     option dontlognull
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     option http-server-close
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     option forwardfor
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     retries                 3
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     timeout http-request    30s
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     timeout connect         30s
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     timeout client          32s
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     timeout server          32s
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     timeout http-keep-alive 30s
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: listen listener
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     bind 169.254.169.254:80
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     server metadata /var/lib/neutron/metadata_proxy
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]: 
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:     http-request add-header X-OVN-Network-ID 0eeee5bd-f568-4881-a684-2e2dd854c2e8
Nov 25 19:30:52 compute-0 ovn_metadata_agent[104351]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Nov 25 19:30:53 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:53.000 104356 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'env', 'PROCESS_TAG=haproxy-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0eeee5bd-f568-4881-a684-2e2dd854c2e8.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.123 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.132 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.136 187216 INFO nova.virt.libvirt.driver [-] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Instance spawned successfully.
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.137 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Nov 25 19:30:53 compute-0 podman[217633]: 2025-11-25 19:30:53.430869311 +0000 UTC m=+0.083005713 container create a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125)
Nov 25 19:30:53 compute-0 podman[217633]: 2025-11-25 19:30:53.378015201 +0000 UTC m=+0.030151663 image pull 8a28ec94bf56c5a892878d39caba33e13c3fdf7366ca4cea65d7c66566a6eb1b 38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Nov 25 19:30:53 compute-0 systemd[1]: Started libpod-conmon-a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95.scope.
Nov 25 19:30:53 compute-0 systemd[1]: Started libcrun container.
Nov 25 19:30:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a685e041299f65dd203d0f9ede0dc6483fb88a5a4a7a56d0af62d8e24fe167b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 25 19:30:53 compute-0 podman[217633]: 2025-11-25 19:30:53.568288193 +0000 UTC m=+0.220424655 container init a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:30:53 compute-0 podman[217633]: 2025-11-25 19:30:53.57808843 +0000 UTC m=+0.230224842 container start a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 19:30:53 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [NOTICE]   (217652) : New worker (217654) forked
Nov 25 19:30:53 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [NOTICE]   (217652) : Loading success.
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.651 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.652 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.653 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.654 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.655 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:30:53 compute-0 nova_compute[187212]: 2025-11-25 19:30:53.655 187216 DEBUG nova.virt.libvirt.driver [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Nov 25 19:30:53 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:53.672 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.170 187216 INFO nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Took 11.35 seconds to spawn the instance on the hypervisor.
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.171 187216 DEBUG nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.725 187216 INFO nova.compute.manager [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Took 17.30 seconds to build instance.
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.748 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.814 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.947 187216 DEBUG nova.compute.manager [req-90831f47-fb0c-4fd1-9cad-0e84d86c0f21 req-6de0b9b9-ac88-4c71-965c-96ed5d6de6d0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-plugged-55822546-3c29-431d-b662-c78aac24c194 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.948 187216 DEBUG oslo_concurrency.lockutils [req-90831f47-fb0c-4fd1-9cad-0e84d86c0f21 req-6de0b9b9-ac88-4c71-965c-96ed5d6de6d0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.948 187216 DEBUG oslo_concurrency.lockutils [req-90831f47-fb0c-4fd1-9cad-0e84d86c0f21 req-6de0b9b9-ac88-4c71-965c-96ed5d6de6d0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.949 187216 DEBUG oslo_concurrency.lockutils [req-90831f47-fb0c-4fd1-9cad-0e84d86c0f21 req-6de0b9b9-ac88-4c71-965c-96ed5d6de6d0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.949 187216 DEBUG nova.compute.manager [req-90831f47-fb0c-4fd1-9cad-0e84d86c0f21 req-6de0b9b9-ac88-4c71-965c-96ed5d6de6d0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] No waiting events found dispatching network-vif-plugged-55822546-3c29-431d-b662-c78aac24c194 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:30:54 compute-0 nova_compute[187212]: 2025-11-25 19:30:54.950 187216 WARNING nova.compute.manager [req-90831f47-fb0c-4fd1-9cad-0e84d86c0f21 req-6de0b9b9-ac88-4c71-965c-96ed5d6de6d0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received unexpected event network-vif-plugged-55822546-3c29-431d-b662-c78aac24c194 for instance with vm_state active and task_state None.
Nov 25 19:30:55 compute-0 nova_compute[187212]: 2025-11-25 19:30:55.233 187216 DEBUG oslo_concurrency.lockutils [None req-bedf9ff0-7ef4-43bd-9910-aacd6fbc2a9e 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.995s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:30:57 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:30:57.674 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:30:58 compute-0 sshd-session[217664]: Invalid user admin from 209.38.103.174 port 52160
Nov 25 19:30:58 compute-0 sshd-session[217664]: Connection closed by invalid user admin 209.38.103.174 port 52160 [preauth]
Nov 25 19:30:59 compute-0 podman[197585]: time="2025-11-25T19:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:30:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:30:59 compute-0 nova_compute[187212]: 2025-11-25 19:30:59.806 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:30:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3548 "" "Go-http-client/1.1"
Nov 25 19:30:59 compute-0 nova_compute[187212]: 2025-11-25 19:30:59.817 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:01 compute-0 podman[217666]: 2025-11-25 19:31:01.224949375 +0000 UTC m=+0.140658549 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: ERROR   19:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: ERROR   19:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: ERROR   19:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: ERROR   19:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: ERROR   19:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:31:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:31:04 compute-0 podman[217694]: 2025-11-25 19:31:04.176441113 +0000 UTC m=+0.088730333 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:31:04 compute-0 nova_compute[187212]: 2025-11-25 19:31:04.808 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:04 compute-0 nova_compute[187212]: 2025-11-25 19:31:04.819 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:06 compute-0 ovn_controller[95465]: 2025-11-25T19:31:06Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fb:93:6a 10.100.0.8
Nov 25 19:31:06 compute-0 ovn_controller[95465]: 2025-11-25T19:31:06Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fb:93:6a 10.100.0.8
Nov 25 19:31:08 compute-0 podman[217731]: 2025-11-25 19:31:08.182670913 +0000 UTC m=+0.095852170 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:31:09 compute-0 nova_compute[187212]: 2025-11-25 19:31:09.820 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:09 compute-0 nova_compute[187212]: 2025-11-25 19:31:09.822 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:09 compute-0 nova_compute[187212]: 2025-11-25 19:31:09.823 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:31:09 compute-0 nova_compute[187212]: 2025-11-25 19:31:09.823 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:09 compute-0 nova_compute[187212]: 2025-11-25 19:31:09.838 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:09 compute-0 nova_compute[187212]: 2025-11-25 19:31:09.838 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:11 compute-0 podman[217752]: 2025-11-25 19:31:11.178341143 +0000 UTC m=+0.090168801 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 19:31:14 compute-0 nova_compute[187212]: 2025-11-25 19:31:14.839 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:14 compute-0 nova_compute[187212]: 2025-11-25 19:31:14.841 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:14 compute-0 nova_compute[187212]: 2025-11-25 19:31:14.841 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:31:14 compute-0 nova_compute[187212]: 2025-11-25 19:31:14.841 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:14 compute-0 nova_compute[187212]: 2025-11-25 19:31:14.869 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:14 compute-0 nova_compute[187212]: 2025-11-25 19:31:14.869 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:19 compute-0 nova_compute[187212]: 2025-11-25 19:31:19.870 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:23 compute-0 ovn_controller[95465]: 2025-11-25T19:31:23Z|00173|memory_trim|INFO|Detected inactivity (last active 30023 ms ago): trimming memory
Nov 25 19:31:23 compute-0 podman[217773]: 2025-11-25 19:31:23.157338224 +0000 UTC m=+0.082453038 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:31:24 compute-0 nova_compute[187212]: 2025-11-25 19:31:24.872 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:24 compute-0 nova_compute[187212]: 2025-11-25 19:31:24.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:24 compute-0 nova_compute[187212]: 2025-11-25 19:31:24.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:31:24 compute-0 nova_compute[187212]: 2025-11-25 19:31:24.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:24 compute-0 nova_compute[187212]: 2025-11-25 19:31:24.912 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:24 compute-0 nova_compute[187212]: 2025-11-25 19:31:24.913 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:29 compute-0 podman[197585]: time="2025-11-25T19:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:31:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:31:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3555 "" "Go-http-client/1.1"
Nov 25 19:31:29 compute-0 nova_compute[187212]: 2025-11-25 19:31:29.914 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:29 compute-0 nova_compute[187212]: 2025-11-25 19:31:29.917 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:31:31.117 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:31:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:31:31.118 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:31:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:31:31.118 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:31:31 compute-0 nova_compute[187212]: 2025-11-25 19:31:31.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:31 compute-0 nova_compute[187212]: 2025-11-25 19:31:31.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: ERROR   19:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: ERROR   19:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: ERROR   19:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: ERROR   19:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: ERROR   19:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:31:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:31:32 compute-0 podman[217799]: 2025-11-25 19:31:32.178750566 +0000 UTC m=+0.106157531 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 25 19:31:34 compute-0 nova_compute[187212]: 2025-11-25 19:31:34.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:34 compute-0 nova_compute[187212]: 2025-11-25 19:31:34.918 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:34 compute-0 nova_compute[187212]: 2025-11-25 19:31:34.919 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:35 compute-0 podman[217825]: 2025-11-25 19:31:35.131771855 +0000 UTC m=+0.055773757 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 19:31:36 compute-0 nova_compute[187212]: 2025-11-25 19:31:36.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:36 compute-0 nova_compute[187212]: 2025-11-25 19:31:36.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:36 compute-0 nova_compute[187212]: 2025-11-25 19:31:36.698 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:31:36 compute-0 nova_compute[187212]: 2025-11-25 19:31:36.699 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:31:36 compute-0 nova_compute[187212]: 2025-11-25 19:31:36.699 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:31:36 compute-0 nova_compute[187212]: 2025-11-25 19:31:36.700 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.753 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.822 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.823 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.872 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.878 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.933 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:31:37 compute-0 nova_compute[187212]: 2025-11-25 19:31:37.934 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.000 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.236 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.239 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.269 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.270 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5496MB free_disk=72.93493270874023GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.270 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:31:38 compute-0 nova_compute[187212]: 2025-11-25 19:31:38.271 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:31:39 compute-0 podman[217858]: 2025-11-25 19:31:39.162771637 +0000 UTC m=+0.084797940 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.849 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.849 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.849 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.850 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:31:38 up  1:24,  0 user,  load average: 0.18, 0.23, 0.34\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.893 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.919 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.920 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:39 compute-0 nova_compute[187212]: 2025-11-25 19:31:39.922 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:40 compute-0 nova_compute[187212]: 2025-11-25 19:31:40.401 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:31:40 compute-0 nova_compute[187212]: 2025-11-25 19:31:40.913 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:31:40 compute-0 nova_compute[187212]: 2025-11-25 19:31:40.913 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.643s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:31:41 compute-0 nova_compute[187212]: 2025-11-25 19:31:41.910 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:41 compute-0 nova_compute[187212]: 2025-11-25 19:31:41.911 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:42 compute-0 podman[217880]: 2025-11-25 19:31:42.142083427 +0000 UTC m=+0.061203299 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:31:42 compute-0 nova_compute[187212]: 2025-11-25 19:31:42.422 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.924 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.925 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.925 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.925 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.955 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:44 compute-0 nova_compute[187212]: 2025-11-25 19:31:44.956 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:47 compute-0 nova_compute[187212]: 2025-11-25 19:31:47.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:31:49 compute-0 sshd-session[217902]: Invalid user admin from 209.38.103.174 port 36346
Nov 25 19:31:49 compute-0 sshd-session[217902]: Connection closed by invalid user admin 209.38.103.174 port 36346 [preauth]
Nov 25 19:31:49 compute-0 nova_compute[187212]: 2025-11-25 19:31:49.956 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:31:49 compute-0 nova_compute[187212]: 2025-11-25 19:31:49.958 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:49 compute-0 nova_compute[187212]: 2025-11-25 19:31:49.958 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:31:49 compute-0 nova_compute[187212]: 2025-11-25 19:31:49.958 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:49 compute-0 nova_compute[187212]: 2025-11-25 19:31:49.959 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:31:49 compute-0 nova_compute[187212]: 2025-11-25 19:31:49.960 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:54 compute-0 podman[217904]: 2025-11-25 19:31:54.181337735 +0000 UTC m=+0.083675950 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:31:54 compute-0 nova_compute[187212]: 2025-11-25 19:31:54.960 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:31:59 compute-0 podman[197585]: time="2025-11-25T19:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:31:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:31:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3550 "" "Go-http-client/1.1"
Nov 25 19:31:59 compute-0 nova_compute[187212]: 2025-11-25 19:31:59.962 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: ERROR   19:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: ERROR   19:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: ERROR   19:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: ERROR   19:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: ERROR   19:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:32:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:32:03 compute-0 podman[217929]: 2025-11-25 19:32:03.168336223 +0000 UTC m=+0.093092008 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 25 19:32:04 compute-0 nova_compute[187212]: 2025-11-25 19:32:04.966 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:04 compute-0 nova_compute[187212]: 2025-11-25 19:32:04.970 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:06 compute-0 podman[217956]: 2025-11-25 19:32:06.170190845 +0000 UTC m=+0.084993185 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 19:32:09 compute-0 nova_compute[187212]: 2025-11-25 19:32:09.971 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:09 compute-0 nova_compute[187212]: 2025-11-25 19:32:09.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:09 compute-0 nova_compute[187212]: 2025-11-25 19:32:09.974 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:32:09 compute-0 nova_compute[187212]: 2025-11-25 19:32:09.974 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:10 compute-0 nova_compute[187212]: 2025-11-25 19:32:10.000 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:10 compute-0 nova_compute[187212]: 2025-11-25 19:32:10.001 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:10 compute-0 podman[217990]: 2025-11-25 19:32:10.192969724 +0000 UTC m=+0.110912786 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 25 19:32:13 compute-0 podman[218012]: 2025-11-25 19:32:13.207496318 +0000 UTC m=+0.113340680 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd)
Nov 25 19:32:15 compute-0 nova_compute[187212]: 2025-11-25 19:32:15.002 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:15 compute-0 nova_compute[187212]: 2025-11-25 19:32:15.041 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:15 compute-0 nova_compute[187212]: 2025-11-25 19:32:15.041 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:32:15 compute-0 nova_compute[187212]: 2025-11-25 19:32:15.041 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:15 compute-0 nova_compute[187212]: 2025-11-25 19:32:15.042 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:15 compute-0 nova_compute[187212]: 2025-11-25 19:32:15.043 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:20 compute-0 nova_compute[187212]: 2025-11-25 19:32:20.044 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:25 compute-0 nova_compute[187212]: 2025-11-25 19:32:25.047 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:25 compute-0 nova_compute[187212]: 2025-11-25 19:32:25.052 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:25 compute-0 nova_compute[187212]: 2025-11-25 19:32:25.053 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:32:25 compute-0 nova_compute[187212]: 2025-11-25 19:32:25.053 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:25 compute-0 nova_compute[187212]: 2025-11-25 19:32:25.095 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:25 compute-0 nova_compute[187212]: 2025-11-25 19:32:25.095 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:25 compute-0 podman[218032]: 2025-11-25 19:32:25.233556896 +0000 UTC m=+0.108638926 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:32:29 compute-0 podman[197585]: time="2025-11-25T19:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:32:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:32:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3549 "" "Go-http-client/1.1"
Nov 25 19:32:30 compute-0 nova_compute[187212]: 2025-11-25 19:32:30.150 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:30 compute-0 nova_compute[187212]: 2025-11-25 19:32:30.151 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:30 compute-0 nova_compute[187212]: 2025-11-25 19:32:30.151 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5056 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:32:30 compute-0 nova_compute[187212]: 2025-11-25 19:32:30.152 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:30 compute-0 nova_compute[187212]: 2025-11-25 19:32:30.152 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:30 compute-0 nova_compute[187212]: 2025-11-25 19:32:30.153 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:32:31.119 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:32:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:32:31.120 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:32:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:32:31.121 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: ERROR   19:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: ERROR   19:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: ERROR   19:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: ERROR   19:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: ERROR   19:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:32:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:32:32 compute-0 nova_compute[187212]: 2025-11-25 19:32:32.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:32 compute-0 nova_compute[187212]: 2025-11-25 19:32:32.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:32:34 compute-0 podman[218057]: 2025-11-25 19:32:34.224684786 +0000 UTC m=+0.140459033 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 19:32:35 compute-0 nova_compute[187212]: 2025-11-25 19:32:35.154 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:36 compute-0 nova_compute[187212]: 2025-11-25 19:32:36.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:37 compute-0 nova_compute[187212]: 2025-11-25 19:32:37.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:37 compute-0 podman[218083]: 2025-11-25 19:32:37.170696461 +0000 UTC m=+0.100606456 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:32:38 compute-0 nova_compute[187212]: 2025-11-25 19:32:38.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:38 compute-0 nova_compute[187212]: 2025-11-25 19:32:38.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:38 compute-0 nova_compute[187212]: 2025-11-25 19:32:38.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:32:38 compute-0 nova_compute[187212]: 2025-11-25 19:32:38.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:32:38 compute-0 nova_compute[187212]: 2025-11-25 19:32:38.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:32:38 compute-0 nova_compute[187212]: 2025-11-25 19:32:38.694 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.737 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.799 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.801 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.861 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.869 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.929 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:32:39 compute-0 nova_compute[187212]: 2025-11-25 19:32:39.930 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.020 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.156 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.293 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.295 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.324 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.325 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5519MB free_disk=72.9350471496582GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.325 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:32:40 compute-0 nova_compute[187212]: 2025-11-25 19:32:40.326 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:32:41 compute-0 podman[218115]: 2025-11-25 19:32:41.144849687 +0000 UTC m=+0.076446410 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 25 19:32:41 compute-0 nova_compute[187212]: 2025-11-25 19:32:41.898 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:32:41 compute-0 nova_compute[187212]: 2025-11-25 19:32:41.898 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:32:41 compute-0 nova_compute[187212]: 2025-11-25 19:32:41.898 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:32:41 compute-0 nova_compute[187212]: 2025-11-25 19:32:41.899 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:32:40 up  1:25,  0 user,  load average: 0.09, 0.20, 0.32\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:32:41 compute-0 nova_compute[187212]: 2025-11-25 19:32:41.955 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:32:42 compute-0 nova_compute[187212]: 2025-11-25 19:32:42.462 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:32:42 compute-0 nova_compute[187212]: 2025-11-25 19:32:42.972 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:32:42 compute-0 nova_compute[187212]: 2025-11-25 19:32:42.972 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.646s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:32:43 compute-0 sshd-session[218136]: Invalid user admin from 209.38.103.174 port 49792
Nov 25 19:32:43 compute-0 sshd-session[218136]: Connection closed by invalid user admin 209.38.103.174 port 49792 [preauth]
Nov 25 19:32:44 compute-0 podman[218138]: 2025-11-25 19:32:44.162468013 +0000 UTC m=+0.079419568 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:32:45 compute-0 nova_compute[187212]: 2025-11-25 19:32:45.158 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:45 compute-0 nova_compute[187212]: 2025-11-25 19:32:45.972 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:45 compute-0 nova_compute[187212]: 2025-11-25 19:32:45.972 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:48 compute-0 nova_compute[187212]: 2025-11-25 19:32:48.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:32:50 compute-0 nova_compute[187212]: 2025-11-25 19:32:50.161 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:55 compute-0 nova_compute[187212]: 2025-11-25 19:32:55.163 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:32:55 compute-0 nova_compute[187212]: 2025-11-25 19:32:55.165 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:55 compute-0 nova_compute[187212]: 2025-11-25 19:32:55.165 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:32:55 compute-0 nova_compute[187212]: 2025-11-25 19:32:55.165 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:55 compute-0 nova_compute[187212]: 2025-11-25 19:32:55.166 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:32:55 compute-0 nova_compute[187212]: 2025-11-25 19:32:55.167 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:32:56 compute-0 podman[218158]: 2025-11-25 19:32:56.142497178 +0000 UTC m=+0.067363801 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:32:59 compute-0 podman[197585]: time="2025-11-25T19:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:32:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:32:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3556 "" "Go-http-client/1.1"
Nov 25 19:33:00 compute-0 nova_compute[187212]: 2025-11-25 19:33:00.166 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: ERROR   19:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: ERROR   19:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: ERROR   19:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: ERROR   19:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: ERROR   19:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:33:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:33:05 compute-0 nova_compute[187212]: 2025-11-25 19:33:05.169 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:05 compute-0 podman[218182]: 2025-11-25 19:33:05.217522107 +0000 UTC m=+0.134703481 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 19:33:08 compute-0 podman[218210]: 2025-11-25 19:33:08.169909076 +0000 UTC m=+0.086624312 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:33:10 compute-0 nova_compute[187212]: 2025-11-25 19:33:10.171 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:12 compute-0 podman[218229]: 2025-11-25 19:33:12.168310544 +0000 UTC m=+0.086986391 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:33:15 compute-0 nova_compute[187212]: 2025-11-25 19:33:15.175 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:15 compute-0 podman[218251]: 2025-11-25 19:33:15.181635067 +0000 UTC m=+0.089471125 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 25 19:33:20 compute-0 nova_compute[187212]: 2025-11-25 19:33:20.178 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:20 compute-0 nova_compute[187212]: 2025-11-25 19:33:20.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:20 compute-0 nova_compute[187212]: 2025-11-25 19:33:20.180 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:33:20 compute-0 nova_compute[187212]: 2025-11-25 19:33:20.180 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:33:20 compute-0 nova_compute[187212]: 2025-11-25 19:33:20.221 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:20 compute-0 nova_compute[187212]: 2025-11-25 19:33:20.221 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:33:25 compute-0 nova_compute[187212]: 2025-11-25 19:33:25.222 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:27 compute-0 podman[218271]: 2025-11-25 19:33:27.167395905 +0000 UTC m=+0.077595997 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:33:29 compute-0 podman[197585]: time="2025-11-25T19:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:33:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:33:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3553 "" "Go-http-client/1.1"
Nov 25 19:33:30 compute-0 nova_compute[187212]: 2025-11-25 19:33:30.226 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:30 compute-0 nova_compute[187212]: 2025-11-25 19:33:30.229 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:33:31.122 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:33:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:33:31.123 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:33:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:33:31.124 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: ERROR   19:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: ERROR   19:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: ERROR   19:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: ERROR   19:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: ERROR   19:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:33:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:33:32 compute-0 sshd-session[218296]: Invalid user admin from 209.38.103.174 port 55700
Nov 25 19:33:32 compute-0 sshd-session[218296]: Connection closed by invalid user admin 209.38.103.174 port 55700 [preauth]
Nov 25 19:33:34 compute-0 nova_compute[187212]: 2025-11-25 19:33:34.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:34 compute-0 nova_compute[187212]: 2025-11-25 19:33:34.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:33:35 compute-0 nova_compute[187212]: 2025-11-25 19:33:35.229 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:35 compute-0 nova_compute[187212]: 2025-11-25 19:33:35.230 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:35 compute-0 nova_compute[187212]: 2025-11-25 19:33:35.230 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:33:35 compute-0 nova_compute[187212]: 2025-11-25 19:33:35.231 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:33:35 compute-0 nova_compute[187212]: 2025-11-25 19:33:35.232 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:33:35 compute-0 nova_compute[187212]: 2025-11-25 19:33:35.233 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:36 compute-0 podman[218298]: 2025-11-25 19:33:36.21293253 +0000 UTC m=+0.133085848 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 19:33:37 compute-0 nova_compute[187212]: 2025-11-25 19:33:37.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:38 compute-0 nova_compute[187212]: 2025-11-25 19:33:38.171 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:39 compute-0 nova_compute[187212]: 2025-11-25 19:33:39.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:39 compute-0 podman[218324]: 2025-11-25 19:33:39.179687524 +0000 UTC m=+0.092777752 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.232 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.234 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:33:40 compute-0 nova_compute[187212]: 2025-11-25 19:33:40.687 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.737 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.827 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.829 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.895 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.902 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.953 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:33:41 compute-0 nova_compute[187212]: 2025-11-25 19:33:41.954 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.003 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.226 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.228 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.259 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.260 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5498MB free_disk=72.93506622314453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.260 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:33:42 compute-0 nova_compute[187212]: 2025-11-25 19:33:42.261 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:33:43 compute-0 podman[218356]: 2025-11-25 19:33:43.164646892 +0000 UTC m=+0.081515089 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 25 19:33:43 compute-0 nova_compute[187212]: 2025-11-25 19:33:43.816 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:33:43 compute-0 nova_compute[187212]: 2025-11-25 19:33:43.816 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:33:43 compute-0 nova_compute[187212]: 2025-11-25 19:33:43.817 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:33:43 compute-0 nova_compute[187212]: 2025-11-25 19:33:43.817 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:33:42 up  1:26,  0 user,  load average: 0.06, 0.17, 0.31\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:33:43 compute-0 nova_compute[187212]: 2025-11-25 19:33:43.871 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:33:44 compute-0 sshd-session[218378]: Connection closed by 203.83.238.251 port 40880
Nov 25 19:33:44 compute-0 nova_compute[187212]: 2025-11-25 19:33:44.379 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:33:44 compute-0 nova_compute[187212]: 2025-11-25 19:33:44.888 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:33:44 compute-0 nova_compute[187212]: 2025-11-25 19:33:44.888 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.627s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:33:45 compute-0 nova_compute[187212]: 2025-11-25 19:33:45.234 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:46 compute-0 podman[218379]: 2025-11-25 19:33:46.166204078 +0000 UTC m=+0.083163471 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.4, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:33:47 compute-0 nova_compute[187212]: 2025-11-25 19:33:47.884 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:48 compute-0 nova_compute[187212]: 2025-11-25 19:33:48.444 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:48 compute-0 nova_compute[187212]: 2025-11-25 19:33:48.445 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:50 compute-0 nova_compute[187212]: 2025-11-25 19:33:50.309 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:50 compute-0 nova_compute[187212]: 2025-11-25 19:33:50.314 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:33:50 compute-0 nova_compute[187212]: 2025-11-25 19:33:50.315 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:33:55 compute-0 nova_compute[187212]: 2025-11-25 19:33:55.244 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:55 compute-0 nova_compute[187212]: 2025-11-25 19:33:55.311 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:33:58 compute-0 podman[218401]: 2025-11-25 19:33:58.183245739 +0000 UTC m=+0.088889751 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:33:59 compute-0 podman[197585]: time="2025-11-25T19:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:33:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:33:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3553 "" "Go-http-client/1.1"
Nov 25 19:34:00 compute-0 nova_compute[187212]: 2025-11-25 19:34:00.246 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:00 compute-0 nova_compute[187212]: 2025-11-25 19:34:00.313 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: ERROR   19:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: ERROR   19:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: ERROR   19:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: ERROR   19:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: ERROR   19:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:34:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:34:05 compute-0 nova_compute[187212]: 2025-11-25 19:34:05.248 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:05 compute-0 nova_compute[187212]: 2025-11-25 19:34:05.314 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:07 compute-0 podman[218425]: 2025-11-25 19:34:07.188796914 +0000 UTC m=+0.111177550 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Nov 25 19:34:10 compute-0 podman[218454]: 2025-11-25 19:34:10.168658968 +0000 UTC m=+0.085765220 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base 
Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Nov 25 19:34:10 compute-0 nova_compute[187212]: 2025-11-25 19:34:10.291 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:10 compute-0 nova_compute[187212]: 2025-11-25 19:34:10.316 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:14 compute-0 podman[218474]: 2025-11-25 19:34:14.145904596 +0000 UTC m=+0.069160598 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, architecture=x86_64)
Nov 25 19:34:15 compute-0 nova_compute[187212]: 2025-11-25 19:34:15.300 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:15 compute-0 nova_compute[187212]: 2025-11-25 19:34:15.318 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:17 compute-0 podman[218496]: 2025-11-25 19:34:17.175258166 +0000 UTC m=+0.090243316 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Nov 25 19:34:20 compute-0 nova_compute[187212]: 2025-11-25 19:34:20.308 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:20 compute-0 nova_compute[187212]: 2025-11-25 19:34:20.319 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:25 compute-0 nova_compute[187212]: 2025-11-25 19:34:25.310 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:25 compute-0 nova_compute[187212]: 2025-11-25 19:34:25.320 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:28 compute-0 sshd-session[218517]: Connection closed by 209.38.103.174 port 43390
Nov 25 19:34:29 compute-0 sshd-session[218518]: Invalid user admin from 209.38.103.174 port 43396
Nov 25 19:34:29 compute-0 podman[218520]: 2025-11-25 19:34:29.114769331 +0000 UTC m=+0.065774970 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:34:29 compute-0 sshd-session[218518]: Connection closed by invalid user admin 209.38.103.174 port 43396 [preauth]
Nov 25 19:34:29 compute-0 podman[197585]: time="2025-11-25T19:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:34:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:34:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3551 "" "Go-http-client/1.1"
Nov 25 19:34:30 compute-0 nova_compute[187212]: 2025-11-25 19:34:30.321 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:34:31.125 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:34:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:34:31.125 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:34:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:34:31.126 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: ERROR   19:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: ERROR   19:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: ERROR   19:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: ERROR   19:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:34:31 compute-0 openstack_network_exporter[199731]: ERROR   19:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:34:32 compute-0 sshd-session[218545]: Connection closed by 111.61.229.78 port 38252
Nov 25 19:34:35 compute-0 nova_compute[187212]: 2025-11-25 19:34:35.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:35 compute-0 nova_compute[187212]: 2025-11-25 19:34:35.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:34:35 compute-0 nova_compute[187212]: 2025-11-25 19:34:35.323 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:38 compute-0 podman[218546]: 2025-11-25 19:34:38.21479847 +0000 UTC m=+0.131551199 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Nov 25 19:34:39 compute-0 nova_compute[187212]: 2025-11-25 19:34:39.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:39 compute-0 nova_compute[187212]: 2025-11-25 19:34:39.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.325 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:34:40 compute-0 nova_compute[187212]: 2025-11-25 19:34:40.689 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:34:41 compute-0 podman[218574]: 2025-11-25 19:34:41.153146014 +0000 UTC m=+0.073204123 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.743 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.835 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.908 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.916 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.975 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:34:41 compute-0 nova_compute[187212]: 2025-11-25 19:34:41.977 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.027 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.298 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.300 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.330 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.331 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5438MB free_disk=72.93506622314453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.331 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:34:42 compute-0 nova_compute[187212]: 2025-11-25 19:34:42.332 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:34:43 compute-0 nova_compute[187212]: 2025-11-25 19:34:43.901 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:34:43 compute-0 nova_compute[187212]: 2025-11-25 19:34:43.901 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:34:43 compute-0 nova_compute[187212]: 2025-11-25 19:34:43.902 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:34:43 compute-0 nova_compute[187212]: 2025-11-25 19:34:43.902 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:34:42 up  1:27,  0 user,  load average: 0.20, 0.19, 0.30\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:34:43 compute-0 nova_compute[187212]: 2025-11-25 19:34:43.976 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:34:44 compute-0 nova_compute[187212]: 2025-11-25 19:34:44.493 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.005 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.005 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.673s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:34:45 compute-0 podman[218607]: 2025-11-25 19:34:45.168794071 +0000 UTC m=+0.086854518 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.328 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.330 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:34:45 compute-0 nova_compute[187212]: 2025-11-25 19:34:45.332 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:48 compute-0 podman[218628]: 2025-11-25 19:34:48.165089602 +0000 UTC m=+0.082949056 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 19:34:49 compute-0 nova_compute[187212]: 2025-11-25 19:34:49.007 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:49 compute-0 nova_compute[187212]: 2025-11-25 19:34:49.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:50 compute-0 nova_compute[187212]: 2025-11-25 19:34:50.335 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:50 compute-0 nova_compute[187212]: 2025-11-25 19:34:50.338 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:50 compute-0 nova_compute[187212]: 2025-11-25 19:34:50.338 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:34:50 compute-0 nova_compute[187212]: 2025-11-25 19:34:50.338 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:34:50 compute-0 nova_compute[187212]: 2025-11-25 19:34:50.371 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:50 compute-0 nova_compute[187212]: 2025-11-25 19:34:50.372 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:34:51 compute-0 nova_compute[187212]: 2025-11-25 19:34:51.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:52 compute-0 nova_compute[187212]: 2025-11-25 19:34:52.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.373 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.375 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.375 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.375 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.408 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.408 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.680 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:55 compute-0 nova_compute[187212]: 2025-11-25 19:34:55.680 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:34:56 compute-0 nova_compute[187212]: 2025-11-25 19:34:56.189 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:34:56 compute-0 nova_compute[187212]: 2025-11-25 19:34:56.190 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:34:56 compute-0 nova_compute[187212]: 2025-11-25 19:34:56.190 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:34:59 compute-0 podman[197585]: time="2025-11-25T19:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:34:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:34:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3553 "" "Go-http-client/1.1"
Nov 25 19:35:00 compute-0 podman[218648]: 2025-11-25 19:35:00.16438775 +0000 UTC m=+0.078653595 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:35:00 compute-0 nova_compute[187212]: 2025-11-25 19:35:00.409 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:00 compute-0 nova_compute[187212]: 2025-11-25 19:35:00.412 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:00 compute-0 nova_compute[187212]: 2025-11-25 19:35:00.412 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:35:00 compute-0 nova_compute[187212]: 2025-11-25 19:35:00.413 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:00 compute-0 nova_compute[187212]: 2025-11-25 19:35:00.445 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:00 compute-0 nova_compute[187212]: 2025-11-25 19:35:00.446 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: ERROR   19:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: ERROR   19:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: ERROR   19:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: ERROR   19:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: ERROR   19:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:35:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:35:05 compute-0 nova_compute[187212]: 2025-11-25 19:35:05.447 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:09 compute-0 podman[218674]: 2025-11-25 19:35:09.277118499 +0000 UTC m=+0.186322362 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 25 19:35:10 compute-0 nova_compute[187212]: 2025-11-25 19:35:10.449 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:12 compute-0 podman[218702]: 2025-11-25 19:35:12.15797618 +0000 UTC m=+0.078810359 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:35:15 compute-0 nova_compute[187212]: 2025-11-25 19:35:15.452 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:16 compute-0 podman[218722]: 2025-11-25 19:35:16.168204105 +0000 UTC m=+0.084539557 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:35:18 compute-0 sshd-session[218743]: Connection closed by 209.38.103.174 port 53426
Nov 25 19:35:19 compute-0 podman[218745]: 2025-11-25 19:35:19.154880416 +0000 UTC m=+0.073006158 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:35:19 compute-0 sshd-session[218744]: Invalid user admin from 209.38.103.174 port 53440
Nov 25 19:35:19 compute-0 sshd-session[218744]: Connection closed by invalid user admin 209.38.103.174 port 53440 [preauth]
Nov 25 19:35:20 compute-0 nova_compute[187212]: 2025-11-25 19:35:20.453 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:20 compute-0 nova_compute[187212]: 2025-11-25 19:35:20.455 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:20 compute-0 nova_compute[187212]: 2025-11-25 19:35:20.455 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:35:20 compute-0 nova_compute[187212]: 2025-11-25 19:35:20.455 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:20 compute-0 nova_compute[187212]: 2025-11-25 19:35:20.495 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:20 compute-0 nova_compute[187212]: 2025-11-25 19:35:20.496 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:25 compute-0 nova_compute[187212]: 2025-11-25 19:35:25.497 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:25 compute-0 nova_compute[187212]: 2025-11-25 19:35:25.499 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:25 compute-0 nova_compute[187212]: 2025-11-25 19:35:25.499 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:35:25 compute-0 nova_compute[187212]: 2025-11-25 19:35:25.499 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:25 compute-0 nova_compute[187212]: 2025-11-25 19:35:25.554 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:25 compute-0 nova_compute[187212]: 2025-11-25 19:35:25.555 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:29 compute-0 podman[197585]: time="2025-11-25T19:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:35:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:35:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3558 "" "Go-http-client/1.1"
Nov 25 19:35:30 compute-0 nova_compute[187212]: 2025-11-25 19:35:30.555 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:30 compute-0 nova_compute[187212]: 2025-11-25 19:35:30.557 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:35:31.127 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:35:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:35:31.127 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:35:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:35:31.128 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:35:31 compute-0 podman[218766]: 2025-11-25 19:35:31.160778656 +0000 UTC m=+0.077189016 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: ERROR   19:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: ERROR   19:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: ERROR   19:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: ERROR   19:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: ERROR   19:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:35:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:35:35 compute-0 nova_compute[187212]: 2025-11-25 19:35:35.559 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:35 compute-0 nova_compute[187212]: 2025-11-25 19:35:35.560 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:35 compute-0 nova_compute[187212]: 2025-11-25 19:35:35.561 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:35:35 compute-0 nova_compute[187212]: 2025-11-25 19:35:35.561 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:35 compute-0 nova_compute[187212]: 2025-11-25 19:35:35.561 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:35 compute-0 nova_compute[187212]: 2025-11-25 19:35:35.563 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:37 compute-0 nova_compute[187212]: 2025-11-25 19:35:37.190 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:37 compute-0 nova_compute[187212]: 2025-11-25 19:35:37.191 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:35:39 compute-0 nova_compute[187212]: 2025-11-25 19:35:39.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:40 compute-0 podman[218792]: 2025-11-25 19:35:40.234931773 +0000 UTC m=+0.150892761 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:35:40 compute-0 nova_compute[187212]: 2025-11-25 19:35:40.564 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.705 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.706 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.706 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:35:41 compute-0 nova_compute[187212]: 2025-11-25 19:35:41.707 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:35:42 compute-0 nova_compute[187212]: 2025-11-25 19:35:42.755 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:35:42 compute-0 nova_compute[187212]: 2025-11-25 19:35:42.833 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:35:42 compute-0 nova_compute[187212]: 2025-11-25 19:35:42.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:35:42 compute-0 nova_compute[187212]: 2025-11-25 19:35:42.927 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:35:42 compute-0 nova_compute[187212]: 2025-11-25 19:35:42.932 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.016 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.018 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.084 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:35:43 compute-0 podman[218829]: 2025-11-25 19:35:43.144827369 +0000 UTC m=+0.068455000 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.238 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.239 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.268 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.269 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5433MB free_disk=72.9350471496582GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.269 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:35:43 compute-0 nova_compute[187212]: 2025-11-25 19:35:43.270 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:35:44 compute-0 nova_compute[187212]: 2025-11-25 19:35:44.938 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:35:44 compute-0 nova_compute[187212]: 2025-11-25 19:35:44.939 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:35:44 compute-0 nova_compute[187212]: 2025-11-25 19:35:44.939 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:35:44 compute-0 nova_compute[187212]: 2025-11-25 19:35:44.940 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:35:43 up  1:28,  0 user,  load average: 0.16, 0.18, 0.29\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:35:44 compute-0 nova_compute[187212]: 2025-11-25 19:35:44.992 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.063 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.064 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.082 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.102 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.166 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.565 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.567 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.568 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.568 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.608 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.609 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:45 compute-0 nova_compute[187212]: 2025-11-25 19:35:45.674 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:35:46 compute-0 nova_compute[187212]: 2025-11-25 19:35:46.185 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:35:46 compute-0 nova_compute[187212]: 2025-11-25 19:35:46.185 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.915s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:35:47 compute-0 podman[218852]: 2025-11-25 19:35:47.1787789 +0000 UTC m=+0.103939571 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 25 19:35:50 compute-0 podman[218874]: 2025-11-25 19:35:50.584639802 +0000 UTC m=+0.090159894 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:35:50 compute-0 nova_compute[187212]: 2025-11-25 19:35:50.610 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:50 compute-0 nova_compute[187212]: 2025-11-25 19:35:50.611 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:51 compute-0 nova_compute[187212]: 2025-11-25 19:35:51.186 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:51 compute-0 nova_compute[187212]: 2025-11-25 19:35:51.697 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:51 compute-0 nova_compute[187212]: 2025-11-25 19:35:51.698 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:51 compute-0 nova_compute[187212]: 2025-11-25 19:35:51.698 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:35:55 compute-0 nova_compute[187212]: 2025-11-25 19:35:55.612 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:35:55 compute-0 nova_compute[187212]: 2025-11-25 19:35:55.613 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:55 compute-0 nova_compute[187212]: 2025-11-25 19:35:55.613 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:35:55 compute-0 nova_compute[187212]: 2025-11-25 19:35:55.614 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:55 compute-0 nova_compute[187212]: 2025-11-25 19:35:55.614 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:35:55 compute-0 nova_compute[187212]: 2025-11-25 19:35:55.615 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:35:58 compute-0 sshd-session[218895]: Invalid user admin from 209.38.103.174 port 40632
Nov 25 19:35:58 compute-0 sshd-session[218895]: Connection closed by invalid user admin 209.38.103.174 port 40632 [preauth]
Nov 25 19:35:59 compute-0 podman[197585]: time="2025-11-25T19:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:35:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:35:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3551 "" "Go-http-client/1.1"
Nov 25 19:36:00 compute-0 nova_compute[187212]: 2025-11-25 19:36:00.616 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:01 compute-0 anacron[202533]: Job `cron.daily' started
Nov 25 19:36:01 compute-0 anacron[202533]: Job `cron.daily' terminated
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: ERROR   19:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: ERROR   19:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: ERROR   19:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: ERROR   19:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: ERROR   19:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:36:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:36:02 compute-0 podman[218899]: 2025-11-25 19:36:02.167128002 +0000 UTC m=+0.084776964 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:36:05 compute-0 nova_compute[187212]: 2025-11-25 19:36:05.618 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:10 compute-0 nova_compute[187212]: 2025-11-25 19:36:10.621 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:10 compute-0 podman[218924]: 2025-11-25 19:36:10.782777116 +0000 UTC m=+0.125612535 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:36:14 compute-0 podman[218950]: 2025-11-25 19:36:14.147983611 +0000 UTC m=+0.068700066 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 19:36:15 compute-0 nova_compute[187212]: 2025-11-25 19:36:15.623 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:18 compute-0 podman[218969]: 2025-11-25 19:36:18.158588456 +0000 UTC m=+0.076066897 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:36:20 compute-0 nova_compute[187212]: 2025-11-25 19:36:20.625 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:20 compute-0 nova_compute[187212]: 2025-11-25 19:36:20.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:21 compute-0 podman[218992]: 2025-11-25 19:36:21.178382837 +0000 UTC m=+0.091297163 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4)
Nov 25 19:36:25 compute-0 nova_compute[187212]: 2025-11-25 19:36:25.628 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:36:25 compute-0 nova_compute[187212]: 2025-11-25 19:36:25.630 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:36:25 compute-0 nova_compute[187212]: 2025-11-25 19:36:25.631 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:36:25 compute-0 nova_compute[187212]: 2025-11-25 19:36:25.631 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:36:25 compute-0 nova_compute[187212]: 2025-11-25 19:36:25.632 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:36:25 compute-0 nova_compute[187212]: 2025-11-25 19:36:25.635 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:29 compute-0 podman[197585]: time="2025-11-25T19:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:36:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:36:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3551 "" "Go-http-client/1.1"
Nov 25 19:36:30 compute-0 nova_compute[187212]: 2025-11-25 19:36:30.635 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:36:31.132 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:36:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:36:31.135 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:36:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:36:31.136 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: ERROR   19:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: ERROR   19:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: ERROR   19:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: ERROR   19:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: ERROR   19:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:36:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:36:33 compute-0 podman[219014]: 2025-11-25 19:36:33.192221424 +0000 UTC m=+0.106710354 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:36:35 compute-0 nova_compute[187212]: 2025-11-25 19:36:35.637 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:37 compute-0 nova_compute[187212]: 2025-11-25 19:36:37.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:37 compute-0 nova_compute[187212]: 2025-11-25 19:36:37.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:36:37 compute-0 sshd-session[219038]: Invalid user admin from 209.38.103.174 port 38302
Nov 25 19:36:37 compute-0 sshd-session[219038]: Connection closed by invalid user admin 209.38.103.174 port 38302 [preauth]
Nov 25 19:36:39 compute-0 nova_compute[187212]: 2025-11-25 19:36:39.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:40 compute-0 nova_compute[187212]: 2025-11-25 19:36:40.639 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:36:40 compute-0 nova_compute[187212]: 2025-11-25 19:36:40.641 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:36:40 compute-0 nova_compute[187212]: 2025-11-25 19:36:40.641 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:36:40 compute-0 nova_compute[187212]: 2025-11-25 19:36:40.641 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:36:40 compute-0 nova_compute[187212]: 2025-11-25 19:36:40.683 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:40 compute-0 nova_compute[187212]: 2025-11-25 19:36:40.683 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:36:41 compute-0 nova_compute[187212]: 2025-11-25 19:36:41.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:41 compute-0 podman[219040]: 2025-11-25 19:36:41.21910321 +0000 UTC m=+0.136249701 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Nov 25 19:36:41 compute-0 nova_compute[187212]: 2025-11-25 19:36:41.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:36:41 compute-0 nova_compute[187212]: 2025-11-25 19:36:41.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:36:41 compute-0 nova_compute[187212]: 2025-11-25 19:36:41.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:36:41 compute-0 nova_compute[187212]: 2025-11-25 19:36:41.688 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:36:42 compute-0 nova_compute[187212]: 2025-11-25 19:36:42.733 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:36:42 compute-0 nova_compute[187212]: 2025-11-25 19:36:42.826 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:36:42 compute-0 nova_compute[187212]: 2025-11-25 19:36:42.827 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:36:42 compute-0 nova_compute[187212]: 2025-11-25 19:36:42.916 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:36:42 compute-0 nova_compute[187212]: 2025-11-25 19:36:42.928 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.020 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.020 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.104 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.370 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.371 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.410 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.411 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5416MB free_disk=72.93510437011719GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.411 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:36:43 compute-0 nova_compute[187212]: 2025-11-25 19:36:43.411 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:36:44 compute-0 nova_compute[187212]: 2025-11-25 19:36:44.975 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:36:44 compute-0 nova_compute[187212]: 2025-11-25 19:36:44.975 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:36:44 compute-0 nova_compute[187212]: 2025-11-25 19:36:44.975 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:36:44 compute-0 nova_compute[187212]: 2025-11-25 19:36:44.976 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:36:43 up  1:29,  0 user,  load average: 0.06, 0.14, 0.27\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.086 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:36:45 compute-0 podman[219080]: 2025-11-25 19:36:45.163788522 +0000 UTC m=+0.079488557 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.594 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.684 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.686 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.686 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.686 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.687 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:36:45 compute-0 nova_compute[187212]: 2025-11-25 19:36:45.689 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:46 compute-0 nova_compute[187212]: 2025-11-25 19:36:46.110 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:36:46 compute-0 nova_compute[187212]: 2025-11-25 19:36:46.110 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.699s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:36:47 compute-0 nova_compute[187212]: 2025-11-25 19:36:47.111 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:47 compute-0 nova_compute[187212]: 2025-11-25 19:36:47.113 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:48 compute-0 nova_compute[187212]: 2025-11-25 19:36:48.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:49 compute-0 podman[219099]: 2025-11-25 19:36:49.156287486 +0000 UTC m=+0.075069782 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 25 19:36:50 compute-0 nova_compute[187212]: 2025-11-25 19:36:50.687 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:52 compute-0 systemd[1]: Starting dnf makecache...
Nov 25 19:36:52 compute-0 podman[219120]: 2025-11-25 19:36:52.172868064 +0000 UTC m=+0.093412898 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:36:52 compute-0 nova_compute[187212]: 2025-11-25 19:36:52.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:52 compute-0 dnf[219121]: Metadata cache refreshed recently.
Nov 25 19:36:52 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 25 19:36:52 compute-0 systemd[1]: Finished dnf makecache.
Nov 25 19:36:53 compute-0 nova_compute[187212]: 2025-11-25 19:36:53.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:36:55 compute-0 nova_compute[187212]: 2025-11-25 19:36:55.689 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:36:59 compute-0 podman[197585]: time="2025-11-25T19:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:36:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:36:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3547 "" "Go-http-client/1.1"
Nov 25 19:37:00 compute-0 nova_compute[187212]: 2025-11-25 19:37:00.691 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: ERROR   19:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: ERROR   19:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: ERROR   19:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: ERROR   19:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: ERROR   19:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:37:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:37:04 compute-0 podman[219141]: 2025-11-25 19:37:04.15205574 +0000 UTC m=+0.075939704 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:37:05 compute-0 nova_compute[187212]: 2025-11-25 19:37:05.693 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:10 compute-0 nova_compute[187212]: 2025-11-25 19:37:10.695 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:10 compute-0 nova_compute[187212]: 2025-11-25 19:37:10.698 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:10 compute-0 nova_compute[187212]: 2025-11-25 19:37:10.698 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:37:10 compute-0 nova_compute[187212]: 2025-11-25 19:37:10.698 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:10 compute-0 nova_compute[187212]: 2025-11-25 19:37:10.699 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:10 compute-0 nova_compute[187212]: 2025-11-25 19:37:10.700 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:12 compute-0 podman[219165]: 2025-11-25 19:37:12.159289016 +0000 UTC m=+0.089049555 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Nov 25 19:37:15 compute-0 nova_compute[187212]: 2025-11-25 19:37:15.700 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:16 compute-0 podman[219191]: 2025-11-25 19:37:16.134127691 +0000 UTC m=+0.058783698 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:37:18 compute-0 sshd-session[219210]: Invalid user admin from 209.38.103.174 port 56400
Nov 25 19:37:18 compute-0 sshd-session[219210]: Connection closed by invalid user admin 209.38.103.174 port 56400 [preauth]
Nov 25 19:37:20 compute-0 podman[219212]: 2025-11-25 19:37:20.178090132 +0000 UTC m=+0.094915007 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 25 19:37:20 compute-0 nova_compute[187212]: 2025-11-25 19:37:20.703 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:23 compute-0 podman[219233]: 2025-11-25 19:37:23.17774774 +0000 UTC m=+0.101828107 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:37:25 compute-0 nova_compute[187212]: 2025-11-25 19:37:25.704 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:29 compute-0 podman[197585]: time="2025-11-25T19:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:37:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:37:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3555 "" "Go-http-client/1.1"
Nov 25 19:37:30 compute-0 nova_compute[187212]: 2025-11-25 19:37:30.705 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:30 compute-0 nova_compute[187212]: 2025-11-25 19:37:30.707 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:30 compute-0 nova_compute[187212]: 2025-11-25 19:37:30.708 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:37:30 compute-0 nova_compute[187212]: 2025-11-25 19:37:30.708 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:30 compute-0 nova_compute[187212]: 2025-11-25 19:37:30.743 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:30 compute-0 nova_compute[187212]: 2025-11-25 19:37:30.744 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:37:31.137 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:37:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:37:31.137 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:37:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:37:31.139 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: ERROR   19:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: ERROR   19:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: ERROR   19:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: ERROR   19:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: ERROR   19:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:37:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:37:35 compute-0 podman[219254]: 2025-11-25 19:37:35.169504603 +0000 UTC m=+0.094571588 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:37:35 compute-0 nova_compute[187212]: 2025-11-25 19:37:35.744 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:35 compute-0 nova_compute[187212]: 2025-11-25 19:37:35.745 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:35 compute-0 nova_compute[187212]: 2025-11-25 19:37:35.746 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:37:35 compute-0 nova_compute[187212]: 2025-11-25 19:37:35.746 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:35 compute-0 nova_compute[187212]: 2025-11-25 19:37:35.746 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:35 compute-0 nova_compute[187212]: 2025-11-25 19:37:35.747 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:38 compute-0 nova_compute[187212]: 2025-11-25 19:37:38.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:38 compute-0 nova_compute[187212]: 2025-11-25 19:37:38.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:37:39 compute-0 nova_compute[187212]: 2025-11-25 19:37:39.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:40 compute-0 nova_compute[187212]: 2025-11-25 19:37:40.749 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:40 compute-0 nova_compute[187212]: 2025-11-25 19:37:40.750 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:42 compute-0 nova_compute[187212]: 2025-11-25 19:37:42.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:43 compute-0 nova_compute[187212]: 2025-11-25 19:37:43.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:43 compute-0 nova_compute[187212]: 2025-11-25 19:37:43.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:43 compute-0 podman[219278]: 2025-11-25 19:37:43.28491192 +0000 UTC m=+0.190718466 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 19:37:43 compute-0 nova_compute[187212]: 2025-11-25 19:37:43.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:37:43 compute-0 nova_compute[187212]: 2025-11-25 19:37:43.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:37:43 compute-0 nova_compute[187212]: 2025-11-25 19:37:43.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:37:43 compute-0 nova_compute[187212]: 2025-11-25 19:37:43.690 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.737 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.791 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.792 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.845 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.851 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.904 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.905 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:37:44 compute-0 nova_compute[187212]: 2025-11-25 19:37:44.956 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.150 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.152 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.172 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.172 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5380MB free_disk=72.93512344360352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.173 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.173 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.750 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.751 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.752 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.752 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.754 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:45 compute-0 nova_compute[187212]: 2025-11-25 19:37:45.755 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:37:46 compute-0 nova_compute[187212]: 2025-11-25 19:37:46.750 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:37:46 compute-0 nova_compute[187212]: 2025-11-25 19:37:46.751 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:37:46 compute-0 nova_compute[187212]: 2025-11-25 19:37:46.752 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:37:46 compute-0 nova_compute[187212]: 2025-11-25 19:37:46.752 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:37:45 up  1:30,  0 user,  load average: 0.02, 0.11, 0.25\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:37:46 compute-0 nova_compute[187212]: 2025-11-25 19:37:46.803 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:37:47 compute-0 podman[219319]: 2025-11-25 19:37:47.162135638 +0000 UTC m=+0.087880693 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Nov 25 19:37:47 compute-0 nova_compute[187212]: 2025-11-25 19:37:47.310 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:37:47 compute-0 nova_compute[187212]: 2025-11-25 19:37:47.820 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:37:47 compute-0 nova_compute[187212]: 2025-11-25 19:37:47.821 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.648s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:37:50 compute-0 nova_compute[187212]: 2025-11-25 19:37:50.756 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:50 compute-0 nova_compute[187212]: 2025-11-25 19:37:50.757 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:51 compute-0 podman[219340]: 2025-11-25 19:37:51.181898021 +0000 UTC m=+0.096753975 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 19:37:52 compute-0 nova_compute[187212]: 2025-11-25 19:37:52.817 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:53 compute-0 nova_compute[187212]: 2025-11-25 19:37:53.331 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:53 compute-0 nova_compute[187212]: 2025-11-25 19:37:53.331 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:54 compute-0 podman[219363]: 2025-11-25 19:37:54.167669549 +0000 UTC m=+0.082361961 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:37:54 compute-0 nova_compute[187212]: 2025-11-25 19:37:54.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:37:55 compute-0 nova_compute[187212]: 2025-11-25 19:37:55.758 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:37:55 compute-0 nova_compute[187212]: 2025-11-25 19:37:55.760 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:37:59 compute-0 podman[197585]: time="2025-11-25T19:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:37:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:37:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3552 "" "Go-http-client/1.1"
Nov 25 19:38:00 compute-0 sshd-session[219383]: Invalid user admin from 209.38.103.174 port 47456
Nov 25 19:38:00 compute-0 sshd-session[219383]: Connection closed by invalid user admin 209.38.103.174 port 47456 [preauth]
Nov 25 19:38:00 compute-0 nova_compute[187212]: 2025-11-25 19:38:00.760 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:00 compute-0 nova_compute[187212]: 2025-11-25 19:38:00.762 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: ERROR   19:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: ERROR   19:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: ERROR   19:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: ERROR   19:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: ERROR   19:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:38:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:38:05 compute-0 nova_compute[187212]: 2025-11-25 19:38:05.764 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:05 compute-0 nova_compute[187212]: 2025-11-25 19:38:05.766 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:06 compute-0 podman[219385]: 2025-11-25 19:38:06.180649353 +0000 UTC m=+0.106127048 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:38:10 compute-0 nova_compute[187212]: 2025-11-25 19:38:10.767 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:10 compute-0 nova_compute[187212]: 2025-11-25 19:38:10.768 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:14 compute-0 podman[219410]: 2025-11-25 19:38:14.202644834 +0000 UTC m=+0.124632059 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Nov 25 19:38:15 compute-0 nova_compute[187212]: 2025-11-25 19:38:15.769 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:15 compute-0 nova_compute[187212]: 2025-11-25 19:38:15.770 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:15 compute-0 nova_compute[187212]: 2025-11-25 19:38:15.770 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:38:15 compute-0 nova_compute[187212]: 2025-11-25 19:38:15.770 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:15 compute-0 nova_compute[187212]: 2025-11-25 19:38:15.771 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:15 compute-0 nova_compute[187212]: 2025-11-25 19:38:15.772 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:18 compute-0 podman[219437]: 2025-11-25 19:38:18.174756708 +0000 UTC m=+0.090459401 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 19:38:20 compute-0 nova_compute[187212]: 2025-11-25 19:38:20.774 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:20 compute-0 nova_compute[187212]: 2025-11-25 19:38:20.817 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:20 compute-0 nova_compute[187212]: 2025-11-25 19:38:20.817 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:38:20 compute-0 nova_compute[187212]: 2025-11-25 19:38:20.818 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:20 compute-0 nova_compute[187212]: 2025-11-25 19:38:20.818 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:20 compute-0 nova_compute[187212]: 2025-11-25 19:38:20.820 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:22 compute-0 podman[219458]: 2025-11-25 19:38:22.143804913 +0000 UTC m=+0.069262110 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Nov 25 19:38:25 compute-0 podman[219480]: 2025-11-25 19:38:25.178014869 +0000 UTC m=+0.097393052 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Nov 25 19:38:25 compute-0 nova_compute[187212]: 2025-11-25 19:38:25.820 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:29 compute-0 podman[197585]: time="2025-11-25T19:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:38:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:38:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3556 "" "Go-http-client/1.1"
Nov 25 19:38:30 compute-0 nova_compute[187212]: 2025-11-25 19:38:30.823 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:30 compute-0 nova_compute[187212]: 2025-11-25 19:38:30.824 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:30 compute-0 nova_compute[187212]: 2025-11-25 19:38:30.825 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:38:30 compute-0 nova_compute[187212]: 2025-11-25 19:38:30.825 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:30 compute-0 nova_compute[187212]: 2025-11-25 19:38:30.825 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:30 compute-0 nova_compute[187212]: 2025-11-25 19:38:30.826 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:38:31.142 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:38:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:38:31.143 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:38:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:38:31.144 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: ERROR   19:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: ERROR   19:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: ERROR   19:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: ERROR   19:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: ERROR   19:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:38:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:38:35 compute-0 nova_compute[187212]: 2025-11-25 19:38:35.829 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:37 compute-0 podman[219502]: 2025-11-25 19:38:37.159749162 +0000 UTC m=+0.084151088 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:38:39 compute-0 nova_compute[187212]: 2025-11-25 19:38:39.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.175 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.832 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.835 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.835 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.836 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.867 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:40 compute-0 nova_compute[187212]: 2025-11-25 19:38:40.868 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:38:44 compute-0 nova_compute[187212]: 2025-11-25 19:38:44.691 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:38:44 compute-0 sshd-session[219528]: Invalid user admin from 209.38.103.174 port 53082
Nov 25 19:38:44 compute-0 sshd-session[219528]: Connection closed by invalid user admin 209.38.103.174 port 53082 [preauth]
Nov 25 19:38:44 compute-0 podman[219530]: 2025-11-25 19:38:44.999208986 +0000 UTC m=+0.128439098 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.744 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.829 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.830 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.869 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.910 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.917 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.985 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:38:45 compute-0 nova_compute[187212]: 2025-11-25 19:38:45.986 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.040 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.263 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.265 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.294 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.295 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5371MB free_disk=72.93512344360352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.296 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:38:46 compute-0 nova_compute[187212]: 2025-11-25 19:38:46.296 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:38:47 compute-0 nova_compute[187212]: 2025-11-25 19:38:47.866 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:38:47 compute-0 nova_compute[187212]: 2025-11-25 19:38:47.866 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:38:47 compute-0 nova_compute[187212]: 2025-11-25 19:38:47.867 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:38:47 compute-0 nova_compute[187212]: 2025-11-25 19:38:47.867 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:38:46 up  1:31,  0 user,  load average: 0.27, 0.17, 0.26\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:38:47 compute-0 nova_compute[187212]: 2025-11-25 19:38:47.934 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:38:48 compute-0 nova_compute[187212]: 2025-11-25 19:38:48.442 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:38:48 compute-0 nova_compute[187212]: 2025-11-25 19:38:48.956 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:38:48 compute-0 nova_compute[187212]: 2025-11-25 19:38:48.957 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.661s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:38:49 compute-0 podman[219569]: 2025-11-25 19:38:49.172715193 +0000 UTC m=+0.091450567 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 25 19:38:50 compute-0 nova_compute[187212]: 2025-11-25 19:38:50.872 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:53 compute-0 podman[219589]: 2025-11-25 19:38:53.129453579 +0000 UTC m=+0.056732815 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 25 19:38:55 compute-0 nova_compute[187212]: 2025-11-25 19:38:55.875 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:38:56 compute-0 podman[219611]: 2025-11-25 19:38:56.167974997 +0000 UTC m=+0.088901921 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Nov 25 19:38:56 compute-0 nova_compute[187212]: 2025-11-25 19:38:56.957 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:56 compute-0 nova_compute[187212]: 2025-11-25 19:38:56.958 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:56 compute-0 nova_compute[187212]: 2025-11-25 19:38:56.958 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:38:59 compute-0 podman[197585]: time="2025-11-25T19:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:38:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:38:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3551 "" "Go-http-client/1.1"
Nov 25 19:39:00 compute-0 nova_compute[187212]: 2025-11-25 19:39:00.877 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:00 compute-0 nova_compute[187212]: 2025-11-25 19:39:00.880 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:00 compute-0 nova_compute[187212]: 2025-11-25 19:39:00.880 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:39:00 compute-0 nova_compute[187212]: 2025-11-25 19:39:00.880 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:00 compute-0 nova_compute[187212]: 2025-11-25 19:39:00.922 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:00 compute-0 nova_compute[187212]: 2025-11-25 19:39:00.922 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: ERROR   19:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: ERROR   19:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: ERROR   19:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: ERROR   19:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: ERROR   19:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:39:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:39:05 compute-0 nova_compute[187212]: 2025-11-25 19:39:05.923 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:05 compute-0 nova_compute[187212]: 2025-11-25 19:39:05.926 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:08 compute-0 podman[219631]: 2025-11-25 19:39:08.139639569 +0000 UTC m=+0.063379398 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.174 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.175 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.175 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.175 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.175 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:39:08 compute-0 nova_compute[187212]: 2025-11-25 19:39:08.176 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.194 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.194 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Image id 5ca774a8-6150-424f-aaca-03ab3a3ee8cf yields fingerprint 1c0eb12bfe5dbef092d49128b5539724adaa8730 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.194 187216 INFO nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] image 5ca774a8-6150-424f-aaca-03ab3a3ee8cf at (/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730): checking
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.194 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] image 5ca774a8-6150-424f-aaca-03ab3a3ee8cf at (/var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730): image is in use _mark_in_use /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:279
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.198 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.200 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] f71d9429-2da3-4b6b-b82d-63027e46f952 is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.200 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] f71d9429-2da3-4b6b-b82d-63027e46f952 has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.201 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.276 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.278 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance f71d9429-2da3-4b6b-b82d-63027e46f952 is backed by 1c0eb12bfe5dbef092d49128b5539724adaa8730 _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.278 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] 7b272b07-af4e-48a9-982b-25888fa2f334 is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.278 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] 7b272b07-af4e-48a9-982b-25888fa2f334 has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.279 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.332 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.333 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 is backed by 1c0eb12bfe5dbef092d49128b5539724adaa8730 _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.333 187216 INFO nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Active base files: /var/lib/nova/instances/_base/1c0eb12bfe5dbef092d49128b5539724adaa8730
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.333 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.334 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 25 19:39:09 compute-0 nova_compute[187212]: 2025-11-25 19:39:09.334 187216 DEBUG nova.virt.libvirt.imagecache [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 25 19:39:10 compute-0 nova_compute[187212]: 2025-11-25 19:39:10.926 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:15 compute-0 podman[219662]: 2025-11-25 19:39:15.208893824 +0000 UTC m=+0.127246747 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 25 19:39:15 compute-0 nova_compute[187212]: 2025-11-25 19:39:15.929 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:20 compute-0 podman[219689]: 2025-11-25 19:39:20.158671991 +0000 UTC m=+0.074415845 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:39:20 compute-0 nova_compute[187212]: 2025-11-25 19:39:20.931 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:20 compute-0 nova_compute[187212]: 2025-11-25 19:39:20.932 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:20 compute-0 nova_compute[187212]: 2025-11-25 19:39:20.932 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:39:20 compute-0 nova_compute[187212]: 2025-11-25 19:39:20.933 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:20 compute-0 nova_compute[187212]: 2025-11-25 19:39:20.933 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:20 compute-0 nova_compute[187212]: 2025-11-25 19:39:20.934 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:24 compute-0 podman[219709]: 2025-11-25 19:39:24.199571001 +0000 UTC m=+0.116396895 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 19:39:25 compute-0 nova_compute[187212]: 2025-11-25 19:39:25.934 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:27 compute-0 podman[219730]: 2025-11-25 19:39:27.181347036 +0000 UTC m=+0.101381715 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:39:28 compute-0 sshd-session[219750]: Invalid user admin from 209.38.103.174 port 55076
Nov 25 19:39:28 compute-0 sshd-session[219750]: Connection closed by invalid user admin 209.38.103.174 port 55076 [preauth]
Nov 25 19:39:29 compute-0 podman[197585]: time="2025-11-25T19:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:39:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:39:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3551 "" "Go-http-client/1.1"
Nov 25 19:39:30 compute-0 nova_compute[187212]: 2025-11-25 19:39:30.936 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:30 compute-0 nova_compute[187212]: 2025-11-25 19:39:30.938 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:30 compute-0 nova_compute[187212]: 2025-11-25 19:39:30.938 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:39:30 compute-0 nova_compute[187212]: 2025-11-25 19:39:30.939 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:30 compute-0 nova_compute[187212]: 2025-11-25 19:39:30.966 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:30 compute-0 nova_compute[187212]: 2025-11-25 19:39:30.967 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:39:31.145 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:39:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:39:31.146 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:39:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:39:31.146 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: ERROR   19:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: ERROR   19:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: ERROR   19:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: ERROR   19:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: ERROR   19:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:39:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:39:35 compute-0 nova_compute[187212]: 2025-11-25 19:39:35.968 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:39 compute-0 podman[219753]: 2025-11-25 19:39:39.152065111 +0000 UTC m=+0.078342366 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:39:40 compute-0 nova_compute[187212]: 2025-11-25 19:39:40.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:41 compute-0 nova_compute[187212]: 2025-11-25 19:39:41.335 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:42 compute-0 nova_compute[187212]: 2025-11-25 19:39:42.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:42 compute-0 nova_compute[187212]: 2025-11-25 19:39:42.176 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:39:45 compute-0 nova_compute[187212]: 2025-11-25 19:39:45.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:45 compute-0 nova_compute[187212]: 2025-11-25 19:39:45.975 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:46 compute-0 nova_compute[187212]: 2025-11-25 19:39:46.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:46 compute-0 nova_compute[187212]: 2025-11-25 19:39:46.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:46 compute-0 podman[219777]: 2025-11-25 19:39:46.258053891 +0000 UTC m=+0.169522696 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:39:46 compute-0 nova_compute[187212]: 2025-11-25 19:39:46.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:39:46 compute-0 nova_compute[187212]: 2025-11-25 19:39:46.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:39:46 compute-0 nova_compute[187212]: 2025-11-25 19:39:46.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:39:46 compute-0 nova_compute[187212]: 2025-11-25 19:39:46.688 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.737 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.823 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.825 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.889 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.895 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.954 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:47 compute-0 nova_compute[187212]: 2025-11-25 19:39:47.955 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.025 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.234 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.235 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.271 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.271 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5367MB free_disk=72.93510055541992GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.272 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:39:48 compute-0 nova_compute[187212]: 2025-11-25 19:39:48.272 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:39:50 compute-0 nova_compute[187212]: 2025-11-25 19:39:50.269 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:39:50 compute-0 nova_compute[187212]: 2025-11-25 19:39:50.269 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:39:50 compute-0 nova_compute[187212]: 2025-11-25 19:39:50.269 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:39:50 compute-0 nova_compute[187212]: 2025-11-25 19:39:50.270 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:39:48 up  1:32,  0 user,  load average: 0.19, 0.16, 0.25\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:39:50 compute-0 nova_compute[187212]: 2025-11-25 19:39:50.331 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:39:50 compute-0 nova_compute[187212]: 2025-11-25 19:39:50.840 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:39:51 compute-0 nova_compute[187212]: 2025-11-25 19:39:51.033 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:51 compute-0 podman[219817]: 2025-11-25 19:39:51.139269526 +0000 UTC m=+0.061211231 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 19:39:51 compute-0 nova_compute[187212]: 2025-11-25 19:39:51.352 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:39:51 compute-0 nova_compute[187212]: 2025-11-25 19:39:51.352 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.080s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:39:55 compute-0 podman[219836]: 2025-11-25 19:39:55.185384014 +0000 UTC m=+0.100164824 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal)
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.035 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.037 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.037 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.038 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.038 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.040 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.687 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.688 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:56 compute-0 nova_compute[187212]: 2025-11-25 19:39:56.688 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:39:57 compute-0 nova_compute[187212]: 2025-11-25 19:39:57.686 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:57 compute-0 nova_compute[187212]: 2025-11-25 19:39:57.687 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:39:58 compute-0 podman[219857]: 2025-11-25 19:39:58.177340375 +0000 UTC m=+0.095185588 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:39:59 compute-0 podman[197585]: time="2025-11-25T19:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:39:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:39:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3559 "" "Go-http-client/1.1"
Nov 25 19:40:01 compute-0 nova_compute[187212]: 2025-11-25 19:40:01.040 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:01 compute-0 nova_compute[187212]: 2025-11-25 19:40:01.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: ERROR   19:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: ERROR   19:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: ERROR   19:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: ERROR   19:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: ERROR   19:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:40:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:40:06 compute-0 nova_compute[187212]: 2025-11-25 19:40:06.042 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:06 compute-0 nova_compute[187212]: 2025-11-25 19:40:06.043 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:06 compute-0 nova_compute[187212]: 2025-11-25 19:40:06.043 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:40:06 compute-0 nova_compute[187212]: 2025-11-25 19:40:06.043 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:40:06 compute-0 nova_compute[187212]: 2025-11-25 19:40:06.044 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:40:06 compute-0 nova_compute[187212]: 2025-11-25 19:40:06.045 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:08 compute-0 nova_compute[187212]: 2025-11-25 19:40:08.690 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:08 compute-0 nova_compute[187212]: 2025-11-25 19:40:08.691 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.203 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.417 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.930 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Triggering sync for uuid 7b272b07-af4e-48a9-982b-25888fa2f334 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.931 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Triggering sync for uuid f71d9429-2da3-4b6b-b82d-63027e46f952 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.932 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.932 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "7b272b07-af4e-48a9-982b-25888fa2f334" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.933 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:40:09 compute-0 nova_compute[187212]: 2025-11-25 19:40:09.934 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:40:10 compute-0 podman[219878]: 2025-11-25 19:40:10.153863436 +0000 UTC m=+0.072900109 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:40:10 compute-0 nova_compute[187212]: 2025-11-25 19:40:10.447 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "7b272b07-af4e-48a9-982b-25888fa2f334" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.514s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:40:10 compute-0 nova_compute[187212]: 2025-11-25 19:40:10.449 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.515s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:40:10 compute-0 sshd-session[219903]: Invalid user backup from 209.38.103.174 port 56226
Nov 25 19:40:10 compute-0 sshd-session[219903]: Connection closed by invalid user backup 209.38.103.174 port 56226 [preauth]
Nov 25 19:40:11 compute-0 nova_compute[187212]: 2025-11-25 19:40:11.045 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:16 compute-0 nova_compute[187212]: 2025-11-25 19:40:16.048 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:17 compute-0 podman[219905]: 2025-11-25 19:40:17.220141878 +0000 UTC m=+0.142810078 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:40:21 compute-0 nova_compute[187212]: 2025-11-25 19:40:21.050 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:22 compute-0 podman[219932]: 2025-11-25 19:40:22.142702416 +0000 UTC m=+0.066028767 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:40:26 compute-0 nova_compute[187212]: 2025-11-25 19:40:26.087 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:26 compute-0 nova_compute[187212]: 2025-11-25 19:40:26.089 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:26 compute-0 nova_compute[187212]: 2025-11-25 19:40:26.090 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:40:26 compute-0 nova_compute[187212]: 2025-11-25 19:40:26.090 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:40:26 compute-0 nova_compute[187212]: 2025-11-25 19:40:26.090 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:40:26 compute-0 nova_compute[187212]: 2025-11-25 19:40:26.092 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:26 compute-0 podman[219952]: 2025-11-25 19:40:26.224954853 +0000 UTC m=+0.106406375 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64)
Nov 25 19:40:29 compute-0 podman[219974]: 2025-11-25 19:40:29.170724594 +0000 UTC m=+0.097609022 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:40:29 compute-0 podman[197585]: time="2025-11-25T19:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:40:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:40:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3552 "" "Go-http-client/1.1"
Nov 25 19:40:31 compute-0 nova_compute[187212]: 2025-11-25 19:40:31.092 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:40:31.147 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:40:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:40:31.148 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:40:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:40:31.148 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: ERROR   19:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: ERROR   19:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: ERROR   19:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: ERROR   19:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: ERROR   19:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:40:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:40:36 compute-0 nova_compute[187212]: 2025-11-25 19:40:36.095 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:40 compute-0 nova_compute[187212]: 2025-11-25 19:40:40.690 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:40 compute-0 podman[219995]: 2025-11-25 19:40:40.819678229 +0000 UTC m=+0.080337036 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:40:41 compute-0 nova_compute[187212]: 2025-11-25 19:40:41.096 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:43 compute-0 nova_compute[187212]: 2025-11-25 19:40:43.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:43 compute-0 nova_compute[187212]: 2025-11-25 19:40:43.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:40:45 compute-0 nova_compute[187212]: 2025-11-25 19:40:45.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:46 compute-0 nova_compute[187212]: 2025-11-25 19:40:46.099 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:47 compute-0 nova_compute[187212]: 2025-11-25 19:40:47.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:47 compute-0 nova_compute[187212]: 2025-11-25 19:40:47.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:40:47 compute-0 nova_compute[187212]: 2025-11-25 19:40:47.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:40:47 compute-0 nova_compute[187212]: 2025-11-25 19:40:47.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:40:47 compute-0 nova_compute[187212]: 2025-11-25 19:40:47.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:40:48 compute-0 podman[220020]: 2025-11-25 19:40:48.226939986 +0000 UTC m=+0.149603498 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 25 19:40:48 compute-0 nova_compute[187212]: 2025-11-25 19:40:48.761 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:40:48 compute-0 nova_compute[187212]: 2025-11-25 19:40:48.833 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:40:48 compute-0 nova_compute[187212]: 2025-11-25 19:40:48.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:40:48 compute-0 nova_compute[187212]: 2025-11-25 19:40:48.918 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:40:48 compute-0 nova_compute[187212]: 2025-11-25 19:40:48.925 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.012 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.013 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.077 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.266 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.268 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.308 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.309 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5373MB free_disk=72.93511962890625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.309 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:40:49 compute-0 nova_compute[187212]: 2025-11-25 19:40:49.310 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.289 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.289 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 7b272b07-af4e-48a9-982b-25888fa2f334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.290 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.290 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:40:49 up  1:33,  0 user,  load average: 0.10, 0.14, 0.24\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3407615aeb074089a7b15fbc9f4e9578': '1', 'io_workload': '0', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.346 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.421 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.422 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.437 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.460 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:40:51 compute-0 nova_compute[187212]: 2025-11-25 19:40:51.533 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:40:52 compute-0 nova_compute[187212]: 2025-11-25 19:40:52.043 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:40:52 compute-0 sshd-session[220059]: Invalid user backup from 209.38.103.174 port 56070
Nov 25 19:40:52 compute-0 podman[220061]: 2025-11-25 19:40:52.30492565 +0000 UTC m=+0.055482958 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:40:52 compute-0 sshd-session[220059]: Connection closed by invalid user backup 209.38.103.174 port 56070 [preauth]
Nov 25 19:40:52 compute-0 nova_compute[187212]: 2025-11-25 19:40:52.552 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:40:52 compute-0 nova_compute[187212]: 2025-11-25 19:40:52.552 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.243s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:40:53 compute-0 nova_compute[187212]: 2025-11-25 19:40:53.548 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:56 compute-0 nova_compute[187212]: 2025-11-25 19:40:56.103 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:40:56 compute-0 nova_compute[187212]: 2025-11-25 19:40:56.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:57 compute-0 podman[220080]: 2025-11-25 19:40:57.176012547 +0000 UTC m=+0.096195405 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, managed_by=edpm_ansible)
Nov 25 19:40:58 compute-0 nova_compute[187212]: 2025-11-25 19:40:58.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:58 compute-0 nova_compute[187212]: 2025-11-25 19:40:58.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:40:59 compute-0 podman[197585]: time="2025-11-25T19:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:40:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19522 "" "Go-http-client/1.1"
Nov 25 19:40:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3555 "" "Go-http-client/1.1"
Nov 25 19:41:00 compute-0 podman[220101]: 2025-11-25 19:41:00.162958019 +0000 UTC m=+0.084230979 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:41:01 compute-0 nova_compute[187212]: 2025-11-25 19:41:01.106 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: ERROR   19:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: ERROR   19:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: ERROR   19:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: ERROR   19:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: ERROR   19:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:41:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:41:06 compute-0 nova_compute[187212]: 2025-11-25 19:41:06.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:11 compute-0 nova_compute[187212]: 2025-11-25 19:41:11.111 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:11 compute-0 podman[220122]: 2025-11-25 19:41:11.15818395 +0000 UTC m=+0.078558397 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:41:16 compute-0 nova_compute[187212]: 2025-11-25 19:41:16.113 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:41:16 compute-0 nova_compute[187212]: 2025-11-25 19:41:16.114 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:16 compute-0 nova_compute[187212]: 2025-11-25 19:41:16.115 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:41:16 compute-0 nova_compute[187212]: 2025-11-25 19:41:16.115 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:41:16 compute-0 nova_compute[187212]: 2025-11-25 19:41:16.115 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:41:16 compute-0 nova_compute[187212]: 2025-11-25 19:41:16.117 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:19 compute-0 podman[220147]: 2025-11-25 19:41:19.222537073 +0000 UTC m=+0.141806811 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller)
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.232 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.233 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.234 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.234 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.234 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.256 187216 INFO nova.compute.manager [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Terminating instance
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.783 187216 DEBUG nova.compute.manager [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Nov 25 19:41:20 compute-0 kernel: tap55822546-3c (unregistering): left promiscuous mode
Nov 25 19:41:20 compute-0 NetworkManager[55552]: <info>  [1764099680.8150] device (tap55822546-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 25 19:41:20 compute-0 ovn_controller[95465]: 2025-11-25T19:41:20Z|00174|binding|INFO|Releasing lport 55822546-3c29-431d-b662-c78aac24c194 from this chassis (sb_readonly=0)
Nov 25 19:41:20 compute-0 ovn_controller[95465]: 2025-11-25T19:41:20Z|00175|binding|INFO|Setting lport 55822546-3c29-431d-b662-c78aac24c194 down in Southbound
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.824 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:20 compute-0 ovn_controller[95465]: 2025-11-25T19:41:20Z|00176|binding|INFO|Removing iface tap55822546-3c ovn-installed in OVS
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.827 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:20.835 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:93:6a 10.100.0.8'], port_security=['fa:16:3e:fb:93:6a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7b272b07-af4e-48a9-982b-25888fa2f334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3407615aeb074089a7b15fbc9f4e9578', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9f3e9396-3e8f-49e3-83aa-c3050a6612b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9e09c55-8331-4266-b41a-8ad7cac362a3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>], logical_port=55822546-3c29-431d-b662-c78aac24c194) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7faa7269a540>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:41:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:20.837 104356 INFO neutron.agent.ovn.metadata.agent [-] Port 55822546-3c29-431d-b662-c78aac24c194 in datapath 0eeee5bd-f568-4881-a684-2e2dd854c2e8 unbound from our chassis
Nov 25 19:41:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:20.839 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0eeee5bd-f568-4881-a684-2e2dd854c2e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:41:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:20.840 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[61b007e1-f405-4f7b-9df3-e0410a1cda18]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:20 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:20.840 104356 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8 namespace which is not needed anymore
Nov 25 19:41:20 compute-0 nova_compute[187212]: 2025-11-25 19:41:20.847 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:20 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Nov 25 19:41:20 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 45.320s CPU time.
Nov 25 19:41:20 compute-0 systemd-machined[153494]: Machine qemu-16-instance-00000015 terminated.
Nov 25 19:41:21 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [NOTICE]   (217652) : haproxy version is 3.0.5-8e879a5
Nov 25 19:41:21 compute-0 podman[220198]: 2025-11-25 19:41:21.010920368 +0000 UTC m=+0.051081852 container kill a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.010 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [NOTICE]   (217652) : path to executable is /usr/sbin/haproxy
Nov 25 19:41:21 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [WARNING]  (217652) : Exiting Master process...
Nov 25 19:41:21 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [ALERT]    (217652) : Current worker (217654) exited with code 143 (Terminated)
Nov 25 19:41:21 compute-0 neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8[217648]: [WARNING]  (217652) : All workers exited. Exiting... (0)
Nov 25 19:41:21 compute-0 systemd[1]: libpod-a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95.scope: Deactivated successfully.
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.022 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.069 187216 INFO nova.virt.libvirt.driver [-] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Instance destroyed successfully.
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.071 187216 DEBUG nova.objects.instance [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lazy-loading 'resources' on Instance uuid 7b272b07-af4e-48a9-982b-25888fa2f334 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Nov 25 19:41:21 compute-0 podman[220218]: 2025-11-25 19:41:21.07869007 +0000 UTC m=+0.036961828 container died a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest)
Nov 25 19:41:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95-userdata-shm.mount: Deactivated successfully.
Nov 25 19:41:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-0a685e041299f65dd203d0f9ede0dc6483fb88a5a4a7a56d0af62d8e24fe167b-merged.mount: Deactivated successfully.
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.116 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.117 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 podman[220218]: 2025-11-25 19:41:21.119343095 +0000 UTC m=+0.077614833 container cleanup a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:41:21 compute-0 systemd[1]: libpod-conmon-a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95.scope: Deactivated successfully.
Nov 25 19:41:21 compute-0 podman[220224]: 2025-11-25 19:41:21.144012858 +0000 UTC m=+0.098036234 container remove a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95 (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.150 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[54f74a2e-6e1f-4ce6-95fc-7ef5c651e49d]: (4, ("Tue Nov 25 07:41:20 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8 (a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95)\na37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95\nTue Nov 25 07:41:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8 (a37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95)\na37681e04c4ed267440d6f50be15cfbaa7c2ae73e7440bf74677ba659467fd95\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.151 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[c878c8ee-419d-4ce3-87a7-d57d453e57e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.152 104356 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0eeee5bd-f568-4881-a684-2e2dd854c2e8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.152 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[24ab4c13-f6a0-4ed8-90aa-de364c37a7cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.153 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0eeee5bd-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:41:21 compute-0 kernel: tap0eeee5bd-f0: left promiscuous mode
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.155 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.169 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.171 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[11463f04-0523-4fe8-9336-2280d5459296]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.180 187216 DEBUG nova.compute.manager [req-3e558314-4433-461f-8b50-fac92f57feff req-4f655d38-6f2a-4fbc-ba5b-660cd18b742f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-unplugged-55822546-3c29-431d-b662-c78aac24c194 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.180 187216 DEBUG oslo_concurrency.lockutils [req-3e558314-4433-461f-8b50-fac92f57feff req-4f655d38-6f2a-4fbc-ba5b-660cd18b742f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.180 187216 DEBUG oslo_concurrency.lockutils [req-3e558314-4433-461f-8b50-fac92f57feff req-4f655d38-6f2a-4fbc-ba5b-660cd18b742f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.181 187216 DEBUG oslo_concurrency.lockutils [req-3e558314-4433-461f-8b50-fac92f57feff req-4f655d38-6f2a-4fbc-ba5b-660cd18b742f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.181 187216 DEBUG nova.compute.manager [req-3e558314-4433-461f-8b50-fac92f57feff req-4f655d38-6f2a-4fbc-ba5b-660cd18b742f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] No waiting events found dispatching network-vif-unplugged-55822546-3c29-431d-b662-c78aac24c194 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.181 187216 DEBUG nova.compute.manager [req-3e558314-4433-461f-8b50-fac92f57feff req-4f655d38-6f2a-4fbc-ba5b-660cd18b742f 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-unplugged-55822546-3c29-431d-b662-c78aac24c194 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.189 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4e945209-b394-470d-98fa-3cc0adb39254]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.190 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[08a411ba-b04a-45e8-881b-ed173aead3ec]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.209 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[4d71da42-bcf5-434c-914f-2788a39f71a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500037, 'reachable_time': 22577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220262, 'error': None, 'target': 'ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.216 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.216 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.216 104475 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0eeee5bd-f568-4881-a684-2e2dd854c2e8 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.217 104475 DEBUG oslo.privsep.daemon [-] privsep: reply[ab484cc9-6c45-41d6-864a-a07f78ee2304]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:41:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d0eeee5bd\x2df568\x2d4881\x2da684\x2d2e2dd854c2e8.mount: Deactivated successfully.
Nov 25 19:41:21 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:21.219 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.588 187216 DEBUG nova.virt.libvirt.vif [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-11-25T19:30:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2100065439',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2100065439',id=21,image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-25T19:30:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3407615aeb074089a7b15fbc9f4e9578',ramdisk_id='',reservation_id='r-z7sedpdx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,admin,member',image_base_image_ref='5ca774a8-6150-424f-aaca-03ab3a3ee8cf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1869045165-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-25T19:30:54Z,user_data=None,user_id='7e1e9cf32ad84b49a76e6a2fc6fe1c70',uuid=7b272b07-af4e-48a9-982b-25888fa2f334,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.589 187216 DEBUG nova.network.os_vif_util [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Converting VIF {"id": "55822546-3c29-431d-b662-c78aac24c194", "address": "fa:16:3e:fb:93:6a", "network": {"id": "0eeee5bd-f568-4881-a684-2e2dd854c2e8", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1376994829-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9cd5f83030a746feb58b69fd4437cb54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55822546-3c", "ovs_interfaceid": "55822546-3c29-431d-b662-c78aac24c194", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.590 187216 DEBUG nova.network.os_vif_util [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.591 187216 DEBUG os_vif [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.594 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.595 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55822546-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.596 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.598 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.600 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.601 187216 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=800c22cc-db78-4f38-bf12-116cfa0f93d2) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.603 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.608 187216 INFO os_vif [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:93:6a,bridge_name='br-int',has_traffic_filtering=True,id=55822546-3c29-431d-b662-c78aac24c194,network=Network(0eeee5bd-f568-4881-a684-2e2dd854c2e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55822546-3c')
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.609 187216 INFO nova.virt.libvirt.driver [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Deleting instance files /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334_del
Nov 25 19:41:21 compute-0 nova_compute[187212]: 2025-11-25 19:41:21.610 187216 INFO nova.virt.libvirt.driver [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Deletion of /var/lib/nova/instances/7b272b07-af4e-48a9-982b-25888fa2f334_del complete
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.125 187216 INFO nova.compute.manager [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Took 1.34 seconds to destroy the instance on the hypervisor.
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.126 187216 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.126 187216 DEBUG nova.compute.manager [-] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.126 187216 DEBUG nova.network.neutron [-] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.126 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.390 187216 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.713 187216 DEBUG nova.compute.manager [req-a8dae0ee-b6a0-44b2-a704-cdf7eac3fdfb req-850d26d3-b5b3-4d4b-b828-be0085ccaad0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-deleted-55822546-3c29-431d-b662-c78aac24c194 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.714 187216 INFO nova.compute.manager [req-a8dae0ee-b6a0-44b2-a704-cdf7eac3fdfb req-850d26d3-b5b3-4d4b-b828-be0085ccaad0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Neutron deleted interface 55822546-3c29-431d-b662-c78aac24c194; detaching it from the instance and deleting it from the info cache
Nov 25 19:41:22 compute-0 nova_compute[187212]: 2025-11-25 19:41:22.714 187216 DEBUG nova.network.neutron [req-a8dae0ee-b6a0-44b2-a704-cdf7eac3fdfb req-850d26d3-b5b3-4d4b-b828-be0085ccaad0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.149 187216 DEBUG nova.network.neutron [-] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Nov 25 19:41:23 compute-0 podman[220264]: 2025-11-25 19:41:23.173848027 +0000 UTC m=+0.081682621 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.223 187216 DEBUG nova.compute.manager [req-a8dae0ee-b6a0-44b2-a704-cdf7eac3fdfb req-850d26d3-b5b3-4d4b-b828-be0085ccaad0 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Detach interface failed, port_id=55822546-3c29-431d-b662-c78aac24c194, reason: Instance 7b272b07-af4e-48a9-982b-25888fa2f334 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.236 187216 DEBUG nova.compute.manager [req-625e3789-c1c6-4d26-8dd4-a9b51d89322b req-c8e0812f-2d16-42e8-a7ab-8b65358b503d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-unplugged-55822546-3c29-431d-b662-c78aac24c194 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.237 187216 DEBUG oslo_concurrency.lockutils [req-625e3789-c1c6-4d26-8dd4-a9b51d89322b req-c8e0812f-2d16-42e8-a7ab-8b65358b503d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Acquiring lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.237 187216 DEBUG oslo_concurrency.lockutils [req-625e3789-c1c6-4d26-8dd4-a9b51d89322b req-c8e0812f-2d16-42e8-a7ab-8b65358b503d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.237 187216 DEBUG oslo_concurrency.lockutils [req-625e3789-c1c6-4d26-8dd4-a9b51d89322b req-c8e0812f-2d16-42e8-a7ab-8b65358b503d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.238 187216 DEBUG nova.compute.manager [req-625e3789-c1c6-4d26-8dd4-a9b51d89322b req-c8e0812f-2d16-42e8-a7ab-8b65358b503d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] No waiting events found dispatching network-vif-unplugged-55822546-3c29-431d-b662-c78aac24c194 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.238 187216 DEBUG nova.compute.manager [req-625e3789-c1c6-4d26-8dd4-a9b51d89322b req-c8e0812f-2d16-42e8-a7ab-8b65358b503d 194945b264a544cfb7282237be66b1bc 105681f01c6e4422bb6b864118579069 - - default default] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Received event network-vif-unplugged-55822546-3c29-431d-b662-c78aac24c194 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Nov 25 19:41:23 compute-0 nova_compute[187212]: 2025-11-25 19:41:23.655 187216 INFO nova.compute.manager [-] [instance: 7b272b07-af4e-48a9-982b-25888fa2f334] Took 1.53 seconds to deallocate network for instance.
Nov 25 19:41:24 compute-0 nova_compute[187212]: 2025-11-25 19:41:24.179 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:24 compute-0 nova_compute[187212]: 2025-11-25 19:41:24.180 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:24 compute-0 nova_compute[187212]: 2025-11-25 19:41:24.243 187216 DEBUG nova.compute.provider_tree [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:41:24 compute-0 nova_compute[187212]: 2025-11-25 19:41:24.751 187216 DEBUG nova.scheduler.client.report [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:41:25 compute-0 nova_compute[187212]: 2025-11-25 19:41:25.262 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:25 compute-0 nova_compute[187212]: 2025-11-25 19:41:25.298 187216 INFO nova.scheduler.client.report [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Deleted allocations for instance 7b272b07-af4e-48a9-982b-25888fa2f334
Nov 25 19:41:26 compute-0 nova_compute[187212]: 2025-11-25 19:41:26.118 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:26 compute-0 nova_compute[187212]: 2025-11-25 19:41:26.334 187216 DEBUG oslo_concurrency.lockutils [None req-f0c4b4ce-2a77-4ea4-bd4b-3ef9d692efa9 7e1e9cf32ad84b49a76e6a2fc6fe1c70 3407615aeb074089a7b15fbc9f4e9578 - - default default] Lock "7b272b07-af4e-48a9-982b-25888fa2f334" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:26 compute-0 nova_compute[187212]: 2025-11-25 19:41:26.603 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:27 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:27.220 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:41:28 compute-0 podman[220285]: 2025-11-25 19:41:28.15283921 +0000 UTC m=+0.071062528 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:41:29 compute-0 podman[197585]: time="2025-11-25T19:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:41:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:41:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3086 "" "Go-http-client/1.1"
Nov 25 19:41:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:31.149 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:31.149 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:41:31.150 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:31 compute-0 nova_compute[187212]: 2025-11-25 19:41:31.158 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:31 compute-0 podman[220307]: 2025-11-25 19:41:31.190005018 +0000 UTC m=+0.113351106 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: ERROR   19:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: ERROR   19:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: ERROR   19:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: ERROR   19:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: ERROR   19:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:41:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:41:31 compute-0 nova_compute[187212]: 2025-11-25 19:41:31.605 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:32 compute-0 sshd-session[220330]: Invalid user backup from 209.38.103.174 port 43266
Nov 25 19:41:33 compute-0 sshd-session[220330]: Connection closed by invalid user backup 209.38.103.174 port 43266 [preauth]
Nov 25 19:41:36 compute-0 nova_compute[187212]: 2025-11-25 19:41:36.162 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:36 compute-0 nova_compute[187212]: 2025-11-25 19:41:36.608 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:41 compute-0 nova_compute[187212]: 2025-11-25 19:41:41.164 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:41 compute-0 nova_compute[187212]: 2025-11-25 19:41:41.610 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:42 compute-0 nova_compute[187212]: 2025-11-25 19:41:42.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:42 compute-0 podman[220332]: 2025-11-25 19:41:42.181361399 +0000 UTC m=+0.096010067 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:41:45 compute-0 nova_compute[187212]: 2025-11-25 19:41:45.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:45 compute-0 nova_compute[187212]: 2025-11-25 19:41:45.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:41:46 compute-0 nova_compute[187212]: 2025-11-25 19:41:46.211 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:46 compute-0 nova_compute[187212]: 2025-11-25 19:41:46.612 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:47 compute-0 nova_compute[187212]: 2025-11-25 19:41:47.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:48 compute-0 nova_compute[187212]: 2025-11-25 19:41:48.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:48 compute-0 nova_compute[187212]: 2025-11-25 19:41:48.705 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:48 compute-0 nova_compute[187212]: 2025-11-25 19:41:48.706 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:48 compute-0 nova_compute[187212]: 2025-11-25 19:41:48.706 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:48 compute-0 nova_compute[187212]: 2025-11-25 19:41:48.706 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:41:49 compute-0 nova_compute[187212]: 2025-11-25 19:41:49.758 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:41:49 compute-0 nova_compute[187212]: 2025-11-25 19:41:49.850 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:41:49 compute-0 nova_compute[187212]: 2025-11-25 19:41:49.852 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:41:49 compute-0 nova_compute[187212]: 2025-11-25 19:41:49.921 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:41:50 compute-0 podman[220364]: 2025-11-25 19:41:50.407311293 +0000 UTC m=+0.114806974 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 19:41:50 compute-0 nova_compute[187212]: 2025-11-25 19:41:50.445 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:41:50 compute-0 nova_compute[187212]: 2025-11-25 19:41:50.447 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:41:50 compute-0 nova_compute[187212]: 2025-11-25 19:41:50.469 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:41:50 compute-0 nova_compute[187212]: 2025-11-25 19:41:50.470 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5595MB free_disk=72.96364212036133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:41:50 compute-0 nova_compute[187212]: 2025-11-25 19:41:50.470 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:41:50 compute-0 nova_compute[187212]: 2025-11-25 19:41:50.470 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:41:51 compute-0 nova_compute[187212]: 2025-11-25 19:41:51.212 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:51 compute-0 nova_compute[187212]: 2025-11-25 19:41:51.617 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:52 compute-0 nova_compute[187212]: 2025-11-25 19:41:52.038 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:41:52 compute-0 nova_compute[187212]: 2025-11-25 19:41:52.039 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:41:52 compute-0 nova_compute[187212]: 2025-11-25 19:41:52.039 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:41:50 up  1:34,  0 user,  load average: 0.23, 0.16, 0.24\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:41:52 compute-0 nova_compute[187212]: 2025-11-25 19:41:52.083 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:41:52 compute-0 nova_compute[187212]: 2025-11-25 19:41:52.592 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:41:53 compute-0 nova_compute[187212]: 2025-11-25 19:41:53.104 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:41:53 compute-0 nova_compute[187212]: 2025-11-25 19:41:53.105 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.634s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:41:54 compute-0 nova_compute[187212]: 2025-11-25 19:41:54.100 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:54 compute-0 nova_compute[187212]: 2025-11-25 19:41:54.100 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:54 compute-0 podman[220390]: 2025-11-25 19:41:54.176871405 +0000 UTC m=+0.096180651 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 25 19:41:56 compute-0 nova_compute[187212]: 2025-11-25 19:41:56.257 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:56 compute-0 nova_compute[187212]: 2025-11-25 19:41:56.624 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:41:57 compute-0 nova_compute[187212]: 2025-11-25 19:41:57.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:58 compute-0 nova_compute[187212]: 2025-11-25 19:41:58.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:58 compute-0 nova_compute[187212]: 2025-11-25 19:41:58.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:41:59 compute-0 podman[220409]: 2025-11-25 19:41:59.20216098 +0000 UTC m=+0.122004254 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 25 19:41:59 compute-0 podman[197585]: time="2025-11-25T19:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:41:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:41:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3086 "" "Go-http-client/1.1"
Nov 25 19:42:00 compute-0 ovn_controller[95465]: 2025-11-25T19:42:00Z|00177|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 25 19:42:01 compute-0 nova_compute[187212]: 2025-11-25 19:42:01.258 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: ERROR   19:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: ERROR   19:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: ERROR   19:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: ERROR   19:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: ERROR   19:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:42:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:42:01 compute-0 nova_compute[187212]: 2025-11-25 19:42:01.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:02 compute-0 podman[220430]: 2025-11-25 19:42:02.144928234 +0000 UTC m=+0.062912753 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:42:06 compute-0 nova_compute[187212]: 2025-11-25 19:42:06.262 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:06 compute-0 nova_compute[187212]: 2025-11-25 19:42:06.629 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:11 compute-0 nova_compute[187212]: 2025-11-25 19:42:11.264 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:11 compute-0 nova_compute[187212]: 2025-11-25 19:42:11.630 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:13 compute-0 sshd-session[220451]: Invalid user backup from 209.38.103.174 port 34866
Nov 25 19:42:13 compute-0 podman[220453]: 2025-11-25 19:42:13.159469732 +0000 UTC m=+0.082685745 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:42:13 compute-0 sshd-session[220451]: Connection closed by invalid user backup 209.38.103.174 port 34866 [preauth]
Nov 25 19:42:16 compute-0 nova_compute[187212]: 2025-11-25 19:42:16.267 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:16 compute-0 nova_compute[187212]: 2025-11-25 19:42:16.633 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:21 compute-0 podman[220478]: 2025-11-25 19:42:21.186704025 +0000 UTC m=+0.107672945 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 19:42:21 compute-0 nova_compute[187212]: 2025-11-25 19:42:21.307 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:21 compute-0 nova_compute[187212]: 2025-11-25 19:42:21.636 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:25 compute-0 podman[220506]: 2025-11-25 19:42:25.169004942 +0000 UTC m=+0.088535350 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:42:26 compute-0 nova_compute[187212]: 2025-11-25 19:42:26.310 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:26 compute-0 nova_compute[187212]: 2025-11-25 19:42:26.638 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:29 compute-0 podman[197585]: time="2025-11-25T19:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:42:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:42:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3089 "" "Go-http-client/1.1"
Nov 25 19:42:30 compute-0 podman[220526]: 2025-11-25 19:42:30.189950212 +0000 UTC m=+0.097289431 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Nov 25 19:42:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:42:31.151 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:42:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:42:31.152 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:42:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:42:31.152 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:42:31 compute-0 nova_compute[187212]: 2025-11-25 19:42:31.312 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: ERROR   19:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: ERROR   19:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: ERROR   19:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: ERROR   19:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: ERROR   19:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:42:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:42:31 compute-0 nova_compute[187212]: 2025-11-25 19:42:31.640 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:33 compute-0 podman[220549]: 2025-11-25 19:42:33.168470503 +0000 UTC m=+0.078490845 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:42:36 compute-0 nova_compute[187212]: 2025-11-25 19:42:36.315 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:36 compute-0 nova_compute[187212]: 2025-11-25 19:42:36.641 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:41 compute-0 nova_compute[187212]: 2025-11-25 19:42:41.317 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:41 compute-0 nova_compute[187212]: 2025-11-25 19:42:41.644 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:42 compute-0 nova_compute[187212]: 2025-11-25 19:42:42.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:42:44 compute-0 podman[220569]: 2025-11-25 19:42:44.168540086 +0000 UTC m=+0.081552975 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:42:46 compute-0 nova_compute[187212]: 2025-11-25 19:42:46.321 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:46 compute-0 nova_compute[187212]: 2025-11-25 19:42:46.649 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:47 compute-0 nova_compute[187212]: 2025-11-25 19:42:47.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:42:47 compute-0 nova_compute[187212]: 2025-11-25 19:42:47.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:42:48 compute-0 nova_compute[187212]: 2025-11-25 19:42:48.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:42:49 compute-0 nova_compute[187212]: 2025-11-25 19:42:49.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:42:50 compute-0 nova_compute[187212]: 2025-11-25 19:42:50.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:42:50 compute-0 nova_compute[187212]: 2025-11-25 19:42:50.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:42:50 compute-0 nova_compute[187212]: 2025-11-25 19:42:50.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:42:50 compute-0 nova_compute[187212]: 2025-11-25 19:42:50.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:42:50 compute-0 nova_compute[187212]: 2025-11-25 19:42:50.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:42:51 compute-0 nova_compute[187212]: 2025-11-25 19:42:51.323 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:51 compute-0 nova_compute[187212]: 2025-11-25 19:42:51.652 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:51 compute-0 nova_compute[187212]: 2025-11-25 19:42:51.741 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:42:51 compute-0 nova_compute[187212]: 2025-11-25 19:42:51.799 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:42:51 compute-0 nova_compute[187212]: 2025-11-25 19:42:51.801 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:42:51 compute-0 nova_compute[187212]: 2025-11-25 19:42:51.864 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:42:52 compute-0 nova_compute[187212]: 2025-11-25 19:42:52.052 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:42:52 compute-0 nova_compute[187212]: 2025-11-25 19:42:52.054 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:42:52 compute-0 nova_compute[187212]: 2025-11-25 19:42:52.089 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:42:52 compute-0 nova_compute[187212]: 2025-11-25 19:42:52.090 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=72.96124649047852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:42:52 compute-0 nova_compute[187212]: 2025-11-25 19:42:52.091 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:42:52 compute-0 nova_compute[187212]: 2025-11-25 19:42:52.091 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:42:52 compute-0 podman[220599]: 2025-11-25 19:42:52.212792589 +0000 UTC m=+0.124617623 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 25 19:42:53 compute-0 nova_compute[187212]: 2025-11-25 19:42:53.655 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:42:53 compute-0 nova_compute[187212]: 2025-11-25 19:42:53.656 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:42:53 compute-0 nova_compute[187212]: 2025-11-25 19:42:53.656 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:42:52 up  1:35,  0 user,  load average: 0.08, 0.13, 0.22\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:42:53 compute-0 nova_compute[187212]: 2025-11-25 19:42:53.713 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:42:54 compute-0 nova_compute[187212]: 2025-11-25 19:42:54.229 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:42:54 compute-0 sshd-session[220628]: Invalid user backup from 209.38.103.174 port 43368
Nov 25 19:42:54 compute-0 sshd-session[220628]: Connection closed by invalid user backup 209.38.103.174 port 43368 [preauth]
Nov 25 19:42:54 compute-0 nova_compute[187212]: 2025-11-25 19:42:54.741 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:42:54 compute-0 nova_compute[187212]: 2025-11-25 19:42:54.741 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.651s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:42:56 compute-0 podman[220630]: 2025-11-25 19:42:56.161440466 +0000 UTC m=+0.072956669 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:42:56 compute-0 nova_compute[187212]: 2025-11-25 19:42:56.327 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:56 compute-0 nova_compute[187212]: 2025-11-25 19:42:56.654 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:42:59 compute-0 podman[197585]: time="2025-11-25T19:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:42:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:42:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3087 "" "Go-http-client/1.1"
Nov 25 19:43:01 compute-0 podman[220649]: 2025-11-25 19:43:01.177004055 +0000 UTC m=+0.092660018 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 25 19:43:01 compute-0 nova_compute[187212]: 2025-11-25 19:43:01.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: ERROR   19:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: ERROR   19:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: ERROR   19:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: ERROR   19:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: ERROR   19:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:43:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:43:01 compute-0 nova_compute[187212]: 2025-11-25 19:43:01.681 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:02 compute-0 nova_compute[187212]: 2025-11-25 19:43:02.741 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:02 compute-0 nova_compute[187212]: 2025-11-25 19:43:02.741 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:02 compute-0 nova_compute[187212]: 2025-11-25 19:43:02.741 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:04 compute-0 podman[220670]: 2025-11-25 19:43:04.154829076 +0000 UTC m=+0.079633805 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4)
Nov 25 19:43:06 compute-0 nova_compute[187212]: 2025-11-25 19:43:06.330 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:06 compute-0 nova_compute[187212]: 2025-11-25 19:43:06.683 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:11 compute-0 nova_compute[187212]: 2025-11-25 19:43:11.334 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:11 compute-0 nova_compute[187212]: 2025-11-25 19:43:11.685 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:15 compute-0 podman[220691]: 2025-11-25 19:43:15.165108478 +0000 UTC m=+0.081522645 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:43:16 compute-0 nova_compute[187212]: 2025-11-25 19:43:16.335 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:16 compute-0 nova_compute[187212]: 2025-11-25 19:43:16.687 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:21 compute-0 nova_compute[187212]: 2025-11-25 19:43:21.382 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:21 compute-0 nova_compute[187212]: 2025-11-25 19:43:21.700 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:23 compute-0 podman[220715]: 2025-11-25 19:43:23.214545719 +0000 UTC m=+0.134600417 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 19:43:26 compute-0 nova_compute[187212]: 2025-11-25 19:43:26.422 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:26 compute-0 nova_compute[187212]: 2025-11-25 19:43:26.703 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:27 compute-0 podman[220743]: 2025-11-25 19:43:27.152905002 +0000 UTC m=+0.072201328 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 25 19:43:29 compute-0 podman[197585]: time="2025-11-25T19:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:43:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:43:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3090 "" "Go-http-client/1.1"
Nov 25 19:43:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:43:31.153 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:43:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:43:31.154 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:43:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:43:31.154 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: ERROR   19:43:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: ERROR   19:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: ERROR   19:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: ERROR   19:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: ERROR   19:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:43:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:43:31 compute-0 nova_compute[187212]: 2025-11-25 19:43:31.425 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:31 compute-0 nova_compute[187212]: 2025-11-25 19:43:31.706 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:32 compute-0 podman[220763]: 2025-11-25 19:43:32.179725827 +0000 UTC m=+0.105846747 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Nov 25 19:43:35 compute-0 podman[220786]: 2025-11-25 19:43:35.182381773 +0000 UTC m=+0.096479770 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 25 19:43:35 compute-0 sshd-session[220784]: Invalid user backup from 209.38.103.174 port 49004
Nov 25 19:43:35 compute-0 sshd-session[220784]: Connection closed by invalid user backup 209.38.103.174 port 49004 [preauth]
Nov 25 19:43:36 compute-0 nova_compute[187212]: 2025-11-25 19:43:36.426 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:36 compute-0 nova_compute[187212]: 2025-11-25 19:43:36.707 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:41 compute-0 nova_compute[187212]: 2025-11-25 19:43:41.477 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:41 compute-0 nova_compute[187212]: 2025-11-25 19:43:41.710 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:44 compute-0 nova_compute[187212]: 2025-11-25 19:43:44.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:46 compute-0 podman[220806]: 2025-11-25 19:43:46.146720171 +0000 UTC m=+0.071454519 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:43:46 compute-0 nova_compute[187212]: 2025-11-25 19:43:46.479 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:46 compute-0 nova_compute[187212]: 2025-11-25 19:43:46.712 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:47 compute-0 nova_compute[187212]: 2025-11-25 19:43:47.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:47 compute-0 nova_compute[187212]: 2025-11-25 19:43:47.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:43:49 compute-0 nova_compute[187212]: 2025-11-25 19:43:49.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:49 compute-0 nova_compute[187212]: 2025-11-25 19:43:49.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:51 compute-0 nova_compute[187212]: 2025-11-25 19:43:51.483 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:51 compute-0 nova_compute[187212]: 2025-11-25 19:43:51.714 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:52 compute-0 nova_compute[187212]: 2025-11-25 19:43:52.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:43:52 compute-0 nova_compute[187212]: 2025-11-25 19:43:52.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:43:52 compute-0 nova_compute[187212]: 2025-11-25 19:43:52.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:43:52 compute-0 nova_compute[187212]: 2025-11-25 19:43:52.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:43:52 compute-0 nova_compute[187212]: 2025-11-25 19:43:52.693 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:43:53 compute-0 nova_compute[187212]: 2025-11-25 19:43:53.731 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:43:53 compute-0 nova_compute[187212]: 2025-11-25 19:43:53.782 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:43:53 compute-0 nova_compute[187212]: 2025-11-25 19:43:53.783 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:43:53 compute-0 nova_compute[187212]: 2025-11-25 19:43:53.835 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:43:54 compute-0 nova_compute[187212]: 2025-11-25 19:43:54.028 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:43:54 compute-0 nova_compute[187212]: 2025-11-25 19:43:54.030 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:43:54 compute-0 nova_compute[187212]: 2025-11-25 19:43:54.057 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:43:54 compute-0 nova_compute[187212]: 2025-11-25 19:43:54.058 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5600MB free_disk=72.96126556396484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:43:54 compute-0 nova_compute[187212]: 2025-11-25 19:43:54.059 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:43:54 compute-0 nova_compute[187212]: 2025-11-25 19:43:54.059 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:43:54 compute-0 podman[220837]: 2025-11-25 19:43:54.166068367 +0000 UTC m=+0.096123100 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 19:43:55 compute-0 nova_compute[187212]: 2025-11-25 19:43:55.637 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:43:55 compute-0 nova_compute[187212]: 2025-11-25 19:43:55.638 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:43:55 compute-0 nova_compute[187212]: 2025-11-25 19:43:55.638 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:43:54 up  1:36,  0 user,  load average: 0.03, 0.10, 0.20\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:43:55 compute-0 nova_compute[187212]: 2025-11-25 19:43:55.676 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:43:56 compute-0 nova_compute[187212]: 2025-11-25 19:43:56.186 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:43:56 compute-0 nova_compute[187212]: 2025-11-25 19:43:56.537 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:56 compute-0 nova_compute[187212]: 2025-11-25 19:43:56.696 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:43:56 compute-0 nova_compute[187212]: 2025-11-25 19:43:56.697 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.637s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:43:56 compute-0 nova_compute[187212]: 2025-11-25 19:43:56.715 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:43:58 compute-0 podman[220865]: 2025-11-25 19:43:58.170937638 +0000 UTC m=+0.086541887 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 25 19:43:59 compute-0 podman[197585]: time="2025-11-25T19:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:43:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:43:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3088 "" "Go-http-client/1.1"
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: ERROR   19:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: ERROR   19:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: ERROR   19:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: ERROR   19:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: ERROR   19:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:44:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:44:01 compute-0 nova_compute[187212]: 2025-11-25 19:44:01.541 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:01 compute-0 nova_compute[187212]: 2025-11-25 19:44:01.717 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:02 compute-0 nova_compute[187212]: 2025-11-25 19:44:02.693 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:03 compute-0 podman[220885]: 2025-11-25 19:44:03.142214677 +0000 UTC m=+0.072510646 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 19:44:03 compute-0 nova_compute[187212]: 2025-11-25 19:44:03.205 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:03 compute-0 nova_compute[187212]: 2025-11-25 19:44:03.205 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:03 compute-0 nova_compute[187212]: 2025-11-25 19:44:03.205 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:06 compute-0 podman[220906]: 2025-11-25 19:44:06.168498139 +0000 UTC m=+0.085139720 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:44:06 compute-0 nova_compute[187212]: 2025-11-25 19:44:06.592 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:06 compute-0 nova_compute[187212]: 2025-11-25 19:44:06.718 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:11 compute-0 nova_compute[187212]: 2025-11-25 19:44:11.596 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:11 compute-0 nova_compute[187212]: 2025-11-25 19:44:11.721 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:16 compute-0 nova_compute[187212]: 2025-11-25 19:44:16.633 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:16 compute-0 nova_compute[187212]: 2025-11-25 19:44:16.722 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:17 compute-0 podman[220928]: 2025-11-25 19:44:17.162622505 +0000 UTC m=+0.072688841 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:44:17 compute-0 sshd-session[220926]: Invalid user backup from 209.38.103.174 port 44306
Nov 25 19:44:17 compute-0 sshd-session[220926]: Connection closed by invalid user backup 209.38.103.174 port 44306 [preauth]
Nov 25 19:44:21 compute-0 nova_compute[187212]: 2025-11-25 19:44:21.664 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:21 compute-0 nova_compute[187212]: 2025-11-25 19:44:21.725 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:25 compute-0 podman[220953]: 2025-11-25 19:44:25.192931077 +0000 UTC m=+0.115205834 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:44:26 compute-0 nova_compute[187212]: 2025-11-25 19:44:26.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:26 compute-0 nova_compute[187212]: 2025-11-25 19:44:26.726 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:29 compute-0 podman[220980]: 2025-11-25 19:44:29.174450852 +0000 UTC m=+0.089753682 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:44:29 compute-0 podman[197585]: time="2025-11-25T19:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:44:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:44:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3088 "" "Go-http-client/1.1"
Nov 25 19:44:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:44:31.155 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:44:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:44:31.155 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:44:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:44:31.156 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: ERROR   19:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: ERROR   19:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: ERROR   19:44:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: ERROR   19:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: ERROR   19:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:44:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:44:31 compute-0 nova_compute[187212]: 2025-11-25 19:44:31.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:31 compute-0 nova_compute[187212]: 2025-11-25 19:44:31.728 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:34 compute-0 podman[221001]: 2025-11-25 19:44:34.183297084 +0000 UTC m=+0.087658347 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 25 19:44:36 compute-0 nova_compute[187212]: 2025-11-25 19:44:36.729 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:37 compute-0 podman[221023]: 2025-11-25 19:44:37.181314398 +0000 UTC m=+0.097270010 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 19:44:41 compute-0 nova_compute[187212]: 2025-11-25 19:44:41.732 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:41 compute-0 nova_compute[187212]: 2025-11-25 19:44:41.733 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:41 compute-0 nova_compute[187212]: 2025-11-25 19:44:41.733 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:44:41 compute-0 nova_compute[187212]: 2025-11-25 19:44:41.733 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:44:41 compute-0 nova_compute[187212]: 2025-11-25 19:44:41.785 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:41 compute-0 nova_compute[187212]: 2025-11-25 19:44:41.785 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:44:45 compute-0 nova_compute[187212]: 2025-11-25 19:44:45.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:46 compute-0 nova_compute[187212]: 2025-11-25 19:44:46.787 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:46 compute-0 nova_compute[187212]: 2025-11-25 19:44:46.788 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:46 compute-0 nova_compute[187212]: 2025-11-25 19:44:46.789 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:44:46 compute-0 nova_compute[187212]: 2025-11-25 19:44:46.789 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:44:46 compute-0 nova_compute[187212]: 2025-11-25 19:44:46.812 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:46 compute-0 nova_compute[187212]: 2025-11-25 19:44:46.813 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:44:48 compute-0 podman[221043]: 2025-11-25 19:44:48.17024851 +0000 UTC m=+0.085010297 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:44:49 compute-0 nova_compute[187212]: 2025-11-25 19:44:49.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:49 compute-0 nova_compute[187212]: 2025-11-25 19:44:49.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:49 compute-0 nova_compute[187212]: 2025-11-25 19:44:49.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.814 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.816 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.816 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.817 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.852 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:51 compute-0 nova_compute[187212]: 2025-11-25 19:44:51.853 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:44:54 compute-0 nova_compute[187212]: 2025-11-25 19:44:54.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:54 compute-0 nova_compute[187212]: 2025-11-25 19:44:54.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:44:54 compute-0 nova_compute[187212]: 2025-11-25 19:44:54.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:44:54 compute-0 nova_compute[187212]: 2025-11-25 19:44:54.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:44:54 compute-0 nova_compute[187212]: 2025-11-25 19:44:54.689 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:44:55 compute-0 nova_compute[187212]: 2025-11-25 19:44:55.749 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:44:55 compute-0 nova_compute[187212]: 2025-11-25 19:44:55.801 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:44:55 compute-0 nova_compute[187212]: 2025-11-25 19:44:55.802 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:44:55 compute-0 nova_compute[187212]: 2025-11-25 19:44:55.880 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.117 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.118 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.140 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.141 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5610MB free_disk=72.96124267578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.141 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.141 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:44:56 compute-0 podman[221074]: 2025-11-25 19:44:56.189757281 +0000 UTC m=+0.103763992 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.853 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:44:56 compute-0 nova_compute[187212]: 2025-11-25 19:44:56.855 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:44:57 compute-0 nova_compute[187212]: 2025-11-25 19:44:57.816 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:44:57 compute-0 nova_compute[187212]: 2025-11-25 19:44:57.817 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:44:57 compute-0 nova_compute[187212]: 2025-11-25 19:44:57.817 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:44:56 up  1:37,  0 user,  load average: 0.01, 0.08, 0.18\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:44:57 compute-0 nova_compute[187212]: 2025-11-25 19:44:57.872 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:44:58 compute-0 nova_compute[187212]: 2025-11-25 19:44:58.382 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:44:58 compute-0 sshd-session[221098]: Invalid user backup from 209.38.103.174 port 46458
Nov 25 19:44:58 compute-0 sshd-session[221098]: Connection closed by invalid user backup 209.38.103.174 port 46458 [preauth]
Nov 25 19:44:58 compute-0 nova_compute[187212]: 2025-11-25 19:44:58.893 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:44:58 compute-0 nova_compute[187212]: 2025-11-25 19:44:58.894 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.753s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:44:58 compute-0 nova_compute[187212]: 2025-11-25 19:44:58.894 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:44:58 compute-0 nova_compute[187212]: 2025-11-25 19:44:58.895 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:44:59 compute-0 podman[197585]: time="2025-11-25T19:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:44:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:44:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3093 "" "Go-http-client/1.1"
Nov 25 19:45:00 compute-0 podman[221100]: 2025-11-25 19:45:00.149382578 +0000 UTC m=+0.062085161 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: ERROR   19:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: ERROR   19:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: ERROR   19:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: ERROR   19:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: ERROR   19:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:45:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:45:01 compute-0 nova_compute[187212]: 2025-11-25 19:45:01.856 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:01 compute-0 nova_compute[187212]: 2025-11-25 19:45:01.857 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:04 compute-0 nova_compute[187212]: 2025-11-25 19:45:04.402 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:04 compute-0 nova_compute[187212]: 2025-11-25 19:45:04.402 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:04 compute-0 nova_compute[187212]: 2025-11-25 19:45:04.402 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:05 compute-0 podman[221120]: 2025-11-25 19:45:05.167619576 +0000 UTC m=+0.093781428 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.openshift.expose-services=, 
name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 19:45:06 compute-0 nova_compute[187212]: 2025-11-25 19:45:06.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:06 compute-0 nova_compute[187212]: 2025-11-25 19:45:06.859 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:06 compute-0 nova_compute[187212]: 2025-11-25 19:45:06.861 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:08 compute-0 podman[221141]: 2025-11-25 19:45:08.178213523 +0000 UTC m=+0.102973431 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 19:45:08 compute-0 nova_compute[187212]: 2025-11-25 19:45:08.682 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:08 compute-0 nova_compute[187212]: 2025-11-25 19:45:08.683 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:45:09 compute-0 nova_compute[187212]: 2025-11-25 19:45:09.190 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:45:11 compute-0 nova_compute[187212]: 2025-11-25 19:45:11.861 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:16 compute-0 nova_compute[187212]: 2025-11-25 19:45:16.864 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:19 compute-0 podman[221161]: 2025-11-25 19:45:19.120975892 +0000 UTC m=+0.049852488 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:45:21 compute-0 nova_compute[187212]: 2025-11-25 19:45:21.866 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:21 compute-0 nova_compute[187212]: 2025-11-25 19:45:21.869 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:26 compute-0 nova_compute[187212]: 2025-11-25 19:45:26.869 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:26 compute-0 nova_compute[187212]: 2025-11-25 19:45:26.870 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:27 compute-0 podman[221185]: 2025-11-25 19:45:27.198449035 +0000 UTC m=+0.122361943 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Nov 25 19:45:29 compute-0 podman[197585]: time="2025-11-25T19:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:45:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:45:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:45:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:45:31.157 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:45:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:45:31.159 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:45:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:45:31.159 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:45:31 compute-0 podman[221212]: 2025-11-25 19:45:31.176093599 +0000 UTC m=+0.090501732 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: ERROR   19:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: ERROR   19:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: ERROR   19:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: ERROR   19:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: ERROR   19:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:45:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:45:31 compute-0 nova_compute[187212]: 2025-11-25 19:45:31.872 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:31 compute-0 nova_compute[187212]: 2025-11-25 19:45:31.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:31 compute-0 nova_compute[187212]: 2025-11-25 19:45:31.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:45:31 compute-0 nova_compute[187212]: 2025-11-25 19:45:31.875 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:45:31 compute-0 nova_compute[187212]: 2025-11-25 19:45:31.912 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:31 compute-0 nova_compute[187212]: 2025-11-25 19:45:31.913 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:45:36 compute-0 podman[221232]: 2025-11-25 19:45:36.160872875 +0000 UTC m=+0.080385524 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, vcs-type=git, release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:45:36 compute-0 nova_compute[187212]: 2025-11-25 19:45:36.913 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:39 compute-0 podman[221254]: 2025-11-25 19:45:39.183243604 +0000 UTC m=+0.092084534 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 25 19:45:40 compute-0 sshd-session[221274]: Connection closed by authenticating user daemon 209.38.103.174 port 54914 [preauth]
Nov 25 19:45:41 compute-0 nova_compute[187212]: 2025-11-25 19:45:41.915 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:41 compute-0 nova_compute[187212]: 2025-11-25 19:45:41.918 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:41 compute-0 nova_compute[187212]: 2025-11-25 19:45:41.919 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:45:41 compute-0 nova_compute[187212]: 2025-11-25 19:45:41.919 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:45:41 compute-0 nova_compute[187212]: 2025-11-25 19:45:41.970 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:41 compute-0 nova_compute[187212]: 2025-11-25 19:45:41.970 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:45:46 compute-0 nova_compute[187212]: 2025-11-25 19:45:46.971 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:45:46 compute-0 nova_compute[187212]: 2025-11-25 19:45:46.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:46 compute-0 nova_compute[187212]: 2025-11-25 19:45:46.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:45:46 compute-0 nova_compute[187212]: 2025-11-25 19:45:46.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:45:46 compute-0 nova_compute[187212]: 2025-11-25 19:45:46.974 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:45:46 compute-0 nova_compute[187212]: 2025-11-25 19:45:46.975 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:47 compute-0 nova_compute[187212]: 2025-11-25 19:45:47.681 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:50 compute-0 podman[221276]: 2025-11-25 19:45:50.165249953 +0000 UTC m=+0.080476176 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:45:50 compute-0 nova_compute[187212]: 2025-11-25 19:45:50.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:51 compute-0 nova_compute[187212]: 2025-11-25 19:45:51.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:51 compute-0 nova_compute[187212]: 2025-11-25 19:45:51.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:45:51 compute-0 nova_compute[187212]: 2025-11-25 19:45:51.976 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:52 compute-0 nova_compute[187212]: 2025-11-25 19:45:52.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:55 compute-0 nova_compute[187212]: 2025-11-25 19:45:55.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:45:55 compute-0 nova_compute[187212]: 2025-11-25 19:45:55.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:45:55 compute-0 nova_compute[187212]: 2025-11-25 19:45:55.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:45:55 compute-0 nova_compute[187212]: 2025-11-25 19:45:55.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:45:55 compute-0 nova_compute[187212]: 2025-11-25 19:45:55.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:45:56 compute-0 nova_compute[187212]: 2025-11-25 19:45:56.945 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:45:56 compute-0 nova_compute[187212]: 2025-11-25 19:45:56.977 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.027 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.028 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.099 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.320 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.322 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.356 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.357 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5603MB free_disk=72.96124267578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.357 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:45:57 compute-0 nova_compute[187212]: 2025-11-25 19:45:57.358 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:45:58 compute-0 podman[221307]: 2025-11-25 19:45:58.285932226 +0000 UTC m=+0.204373399 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:45:58 compute-0 nova_compute[187212]: 2025-11-25 19:45:58.983 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:45:58 compute-0 nova_compute[187212]: 2025-11-25 19:45:58.984 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:45:58 compute-0 nova_compute[187212]: 2025-11-25 19:45:58.984 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:45:57 up  1:38,  0 user,  load average: 0.00, 0.06, 0.17\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.069 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.151 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.152 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.177 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.199 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.237 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:45:59 compute-0 podman[197585]: time="2025-11-25T19:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:45:59 compute-0 nova_compute[187212]: 2025-11-25 19:45:59.747 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:45:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:45:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3086 "" "Go-http-client/1.1"
Nov 25 19:46:00 compute-0 nova_compute[187212]: 2025-11-25 19:46:00.258 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:46:00 compute-0 nova_compute[187212]: 2025-11-25 19:46:00.259 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.901s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: ERROR   19:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: ERROR   19:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: ERROR   19:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: ERROR   19:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: ERROR   19:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:46:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:46:01 compute-0 nova_compute[187212]: 2025-11-25 19:46:01.979 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:01 compute-0 nova_compute[187212]: 2025-11-25 19:46:01.981 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:01 compute-0 nova_compute[187212]: 2025-11-25 19:46:01.982 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:46:01 compute-0 nova_compute[187212]: 2025-11-25 19:46:01.982 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:46:02 compute-0 nova_compute[187212]: 2025-11-25 19:46:02.004 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:02 compute-0 nova_compute[187212]: 2025-11-25 19:46:02.005 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:46:02 compute-0 podman[221335]: 2025-11-25 19:46:02.158043021 +0000 UTC m=+0.081697419 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:46:03 compute-0 nova_compute[187212]: 2025-11-25 19:46:03.255 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:03 compute-0 nova_compute[187212]: 2025-11-25 19:46:03.767 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:03 compute-0 nova_compute[187212]: 2025-11-25 19:46:03.768 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:03 compute-0 nova_compute[187212]: 2025-11-25 19:46:03.768 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:07 compute-0 nova_compute[187212]: 2025-11-25 19:46:07.006 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:07 compute-0 nova_compute[187212]: 2025-11-25 19:46:07.008 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:07 compute-0 podman[221355]: 2025-11-25 19:46:07.145391886 +0000 UTC m=+0.065022789 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 25 19:46:10 compute-0 podman[221377]: 2025-11-25 19:46:10.183750437 +0000 UTC m=+0.102103328 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 25 19:46:12 compute-0 nova_compute[187212]: 2025-11-25 19:46:12.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:17 compute-0 nova_compute[187212]: 2025-11-25 19:46:17.059 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:17 compute-0 nova_compute[187212]: 2025-11-25 19:46:17.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:17 compute-0 nova_compute[187212]: 2025-11-25 19:46:17.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:46:17 compute-0 nova_compute[187212]: 2025-11-25 19:46:17.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:46:17 compute-0 nova_compute[187212]: 2025-11-25 19:46:17.061 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:46:17 compute-0 nova_compute[187212]: 2025-11-25 19:46:17.062 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:20 compute-0 sshd-session[221397]: Connection closed by authenticating user daemon 209.38.103.174 port 60810 [preauth]
Nov 25 19:46:21 compute-0 podman[221399]: 2025-11-25 19:46:21.152955196 +0000 UTC m=+0.072270350 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:46:22 compute-0 nova_compute[187212]: 2025-11-25 19:46:22.062 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:27 compute-0 nova_compute[187212]: 2025-11-25 19:46:27.064 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:29 compute-0 podman[221423]: 2025-11-25 19:46:29.276050464 +0000 UTC m=+0.190253157 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Nov 25 19:46:29 compute-0 podman[197585]: time="2025-11-25T19:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:46:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:46:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3088 "" "Go-http-client/1.1"
Nov 25 19:46:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:46:31.161 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:46:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:46:31.161 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:46:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:46:31.163 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: ERROR   19:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: ERROR   19:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: ERROR   19:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: ERROR   19:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: ERROR   19:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:46:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:46:32 compute-0 nova_compute[187212]: 2025-11-25 19:46:32.066 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:33 compute-0 podman[221452]: 2025-11-25 19:46:33.16114025 +0000 UTC m=+0.074854018 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:46:37 compute-0 nova_compute[187212]: 2025-11-25 19:46:37.068 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:38 compute-0 podman[221472]: 2025-11-25 19:46:38.137517984 +0000 UTC m=+0.064288319 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Nov 25 19:46:41 compute-0 podman[221493]: 2025-11-25 19:46:41.137157641 +0000 UTC m=+0.065843141 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Nov 25 19:46:42 compute-0 nova_compute[187212]: 2025-11-25 19:46:42.071 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.073 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.105 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.105 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.106 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.106 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:46:47 compute-0 nova_compute[187212]: 2025-11-25 19:46:47.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:51 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:46:51.594 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:46:51 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:46:51.594 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:46:51 compute-0 nova_compute[187212]: 2025-11-25 19:46:51.596 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:52 compute-0 nova_compute[187212]: 2025-11-25 19:46:52.107 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:52 compute-0 podman[221516]: 2025-11-25 19:46:52.149620303 +0000 UTC m=+0.066228170 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:46:52 compute-0 nova_compute[187212]: 2025-11-25 19:46:52.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:53 compute-0 ovn_controller[95465]: 2025-11-25T19:46:53Z|00178|binding|INFO|Releasing lport f3db0a73-6d5e-44f5-a754-565ad86befff from this chassis (sb_readonly=0)
Nov 25 19:46:53 compute-0 nova_compute[187212]: 2025-11-25 19:46:53.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:53 compute-0 nova_compute[187212]: 2025-11-25 19:46:53.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:53 compute-0 nova_compute[187212]: 2025-11-25 19:46:53.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:46:53 compute-0 nova_compute[187212]: 2025-11-25 19:46:53.201 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:55 compute-0 nova_compute[187212]: 2025-11-25 19:46:55.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:46:55 compute-0 nova_compute[187212]: 2025-11-25 19:46:55.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:46:55 compute-0 nova_compute[187212]: 2025-11-25 19:46:55.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:46:55 compute-0 nova_compute[187212]: 2025-11-25 19:46:55.695 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:46:55 compute-0 nova_compute[187212]: 2025-11-25 19:46:55.695 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:46:56 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:46:56.595 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:46:56 compute-0 nova_compute[187212]: 2025-11-25 19:46:56.747 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:46:56 compute-0 nova_compute[187212]: 2025-11-25 19:46:56.803 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:46:56 compute-0 nova_compute[187212]: 2025-11-25 19:46:56.804 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:46:56 compute-0 nova_compute[187212]: 2025-11-25 19:46:56.862 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.018 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.020 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.037 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.038 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5609MB free_disk=72.9612922668457GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.038 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.039 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:46:57 compute-0 nova_compute[187212]: 2025-11-25 19:46:57.109 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:46:58 compute-0 nova_compute[187212]: 2025-11-25 19:46:58.591 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:46:58 compute-0 nova_compute[187212]: 2025-11-25 19:46:58.591 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:46:58 compute-0 nova_compute[187212]: 2025-11-25 19:46:58.592 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:46:57 up  1:39,  0 user,  load average: 0.00, 0.05, 0.16\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:46:58 compute-0 nova_compute[187212]: 2025-11-25 19:46:58.631 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:46:59 compute-0 nova_compute[187212]: 2025-11-25 19:46:59.159 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:46:59 compute-0 nova_compute[187212]: 2025-11-25 19:46:59.674 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:46:59 compute-0 nova_compute[187212]: 2025-11-25 19:46:59.675 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.636s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:46:59 compute-0 podman[197585]: time="2025-11-25T19:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:46:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:46:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3090 "" "Go-http-client/1.1"
Nov 25 19:47:00 compute-0 podman[221548]: 2025-11-25 19:47:00.249000443 +0000 UTC m=+0.163728376 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 19:47:00 compute-0 sshd-session[221573]: Connection closed by authenticating user daemon 209.38.103.174 port 45724 [preauth]
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: ERROR   19:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: ERROR   19:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: ERROR   19:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: ERROR   19:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: ERROR   19:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:47:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:47:02 compute-0 nova_compute[187212]: 2025-11-25 19:47:02.111 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:02 compute-0 nova_compute[187212]: 2025-11-25 19:47:02.113 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:02 compute-0 nova_compute[187212]: 2025-11-25 19:47:02.113 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:47:02 compute-0 nova_compute[187212]: 2025-11-25 19:47:02.113 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:02 compute-0 nova_compute[187212]: 2025-11-25 19:47:02.173 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:02 compute-0 nova_compute[187212]: 2025-11-25 19:47:02.174 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:04 compute-0 podman[221577]: 2025-11-25 19:47:04.188021213 +0000 UTC m=+0.093403808 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:47:05 compute-0 nova_compute[187212]: 2025-11-25 19:47:05.676 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:05 compute-0 nova_compute[187212]: 2025-11-25 19:47:05.676 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:05 compute-0 nova_compute[187212]: 2025-11-25 19:47:05.677 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:06.054 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:ee:a5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-457d8244-6bb8-4051-be6f-0207db7d7c86', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-457d8244-6bb8-4051-be6f-0207db7d7c86', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e128b69f237144e3ad7bb4f69ac14026', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=889d46c5-159b-46c4-941a-0ab235156756, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=679412ab-1d19-464d-b6c8-885deeec3d88) old=Port_Binding(mac=['fa:16:3e:81:ee:a5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-457d8244-6bb8-4051-be6f-0207db7d7c86', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-457d8244-6bb8-4051-be6f-0207db7d7c86', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e128b69f237144e3ad7bb4f69ac14026', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:47:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:06.056 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 679412ab-1d19-464d-b6c8-885deeec3d88 in datapath 457d8244-6bb8-4051-be6f-0207db7d7c86 updated
Nov 25 19:47:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:06.058 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 457d8244-6bb8-4051-be6f-0207db7d7c86, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:47:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:06.060 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf45c93-6232-48ab-a96f-293dffcee728]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:47:07 compute-0 nova_compute[187212]: 2025-11-25 19:47:07.174 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:07 compute-0 nova_compute[187212]: 2025-11-25 19:47:07.176 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:09 compute-0 podman[221596]: 2025-11-25 19:47:09.154097025 +0000 UTC m=+0.073976375 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 19:47:12 compute-0 nova_compute[187212]: 2025-11-25 19:47:12.177 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:12 compute-0 nova_compute[187212]: 2025-11-25 19:47:12.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:12 compute-0 nova_compute[187212]: 2025-11-25 19:47:12.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:47:12 compute-0 nova_compute[187212]: 2025-11-25 19:47:12.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:12 compute-0 podman[221617]: 2025-11-25 19:47:12.18506619 +0000 UTC m=+0.100144557 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:47:12 compute-0 nova_compute[187212]: 2025-11-25 19:47:12.223 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:12 compute-0 nova_compute[187212]: 2025-11-25 19:47:12.224 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:13.782 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:57:fa 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-793c3945-a5ea-44cb-985e-8575e82d0e9d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-793c3945-a5ea-44cb-985e-8575e82d0e9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d0d9e1b8b0946c2a35b0680c90c9512', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9ae83d2-3446-4a76-96c7-6c0a4eac9a76, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=63b577bb-abfe-4ae1-9642-00a58821d0d3) old=Port_Binding(mac=['fa:16:3e:0c:57:fa'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-793c3945-a5ea-44cb-985e-8575e82d0e9d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-793c3945-a5ea-44cb-985e-8575e82d0e9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d0d9e1b8b0946c2a35b0680c90c9512', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:47:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:13.784 104356 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 63b577bb-abfe-4ae1-9642-00a58821d0d3 in datapath 793c3945-a5ea-44cb-985e-8575e82d0e9d updated
Nov 25 19:47:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:13.786 104356 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 793c3945-a5ea-44cb-985e-8575e82d0e9d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Nov 25 19:47:13 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:13.787 208756 DEBUG oslo.privsep.daemon [-] privsep: reply[f44f5b6c-0728-41af-8b71-239610958918]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Nov 25 19:47:17 compute-0 nova_compute[187212]: 2025-11-25 19:47:17.224 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:17 compute-0 nova_compute[187212]: 2025-11-25 19:47:17.226 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:17 compute-0 nova_compute[187212]: 2025-11-25 19:47:17.227 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:47:17 compute-0 nova_compute[187212]: 2025-11-25 19:47:17.227 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:17 compute-0 nova_compute[187212]: 2025-11-25 19:47:17.277 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:17 compute-0 nova_compute[187212]: 2025-11-25 19:47:17.278 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:22 compute-0 nova_compute[187212]: 2025-11-25 19:47:22.278 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:23 compute-0 podman[221637]: 2025-11-25 19:47:23.173834098 +0000 UTC m=+0.097727093 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:47:27 compute-0 nova_compute[187212]: 2025-11-25 19:47:27.280 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:29 compute-0 podman[197585]: time="2025-11-25T19:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:47:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:47:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 19:47:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:31.164 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:47:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:31.165 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:47:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:47:31.166 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:47:31 compute-0 podman[221663]: 2025-11-25 19:47:31.230465888 +0000 UTC m=+0.145572986 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest)
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: ERROR   19:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: ERROR   19:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: ERROR   19:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: ERROR   19:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: ERROR   19:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:47:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:47:32 compute-0 nova_compute[187212]: 2025-11-25 19:47:32.282 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:35 compute-0 podman[221690]: 2025-11-25 19:47:35.164944382 +0000 UTC m=+0.086395343 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:47:37 compute-0 nova_compute[187212]: 2025-11-25 19:47:37.284 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:39 compute-0 sshd-session[221710]: Connection closed by authenticating user daemon 209.38.103.174 port 44258 [preauth]
Nov 25 19:47:40 compute-0 podman[221712]: 2025-11-25 19:47:40.170612512 +0000 UTC m=+0.094372194 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 25 19:47:42 compute-0 nova_compute[187212]: 2025-11-25 19:47:42.286 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:47:42 compute-0 nova_compute[187212]: 2025-11-25 19:47:42.288 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:42 compute-0 nova_compute[187212]: 2025-11-25 19:47:42.288 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:47:42 compute-0 nova_compute[187212]: 2025-11-25 19:47:42.288 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:42 compute-0 nova_compute[187212]: 2025-11-25 19:47:42.289 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:47:42 compute-0 nova_compute[187212]: 2025-11-25 19:47:42.290 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:43 compute-0 podman[221734]: 2025-11-25 19:47:43.181008122 +0000 UTC m=+0.093389038 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Nov 25 19:47:47 compute-0 nova_compute[187212]: 2025-11-25 19:47:47.289 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:47 compute-0 nova_compute[187212]: 2025-11-25 19:47:47.291 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:48 compute-0 nova_compute[187212]: 2025-11-25 19:47:48.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:52 compute-0 nova_compute[187212]: 2025-11-25 19:47:52.291 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:53 compute-0 nova_compute[187212]: 2025-11-25 19:47:53.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:54 compute-0 nova_compute[187212]: 2025-11-25 19:47:54.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:54 compute-0 nova_compute[187212]: 2025-11-25 19:47:54.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:54 compute-0 nova_compute[187212]: 2025-11-25 19:47:54.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:47:54 compute-0 podman[221755]: 2025-11-25 19:47:54.186111508 +0000 UTC m=+0.089477474 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:47:57 compute-0 nova_compute[187212]: 2025-11-25 19:47:57.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:47:57 compute-0 nova_compute[187212]: 2025-11-25 19:47:57.294 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:47:57 compute-0 nova_compute[187212]: 2025-11-25 19:47:57.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:47:57 compute-0 nova_compute[187212]: 2025-11-25 19:47:57.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:47:57 compute-0 nova_compute[187212]: 2025-11-25 19:47:57.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:47:57 compute-0 nova_compute[187212]: 2025-11-25 19:47:57.691 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:47:58 compute-0 nova_compute[187212]: 2025-11-25 19:47:58.748 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:47:58 compute-0 nova_compute[187212]: 2025-11-25 19:47:58.808 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:47:58 compute-0 nova_compute[187212]: 2025-11-25 19:47:58.810 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:47:58 compute-0 nova_compute[187212]: 2025-11-25 19:47:58.876 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:47:59 compute-0 nova_compute[187212]: 2025-11-25 19:47:59.112 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:47:59 compute-0 nova_compute[187212]: 2025-11-25 19:47:59.114 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:47:59 compute-0 nova_compute[187212]: 2025-11-25 19:47:59.160 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:47:59 compute-0 nova_compute[187212]: 2025-11-25 19:47:59.161 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5605MB free_disk=72.96128845214844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:47:59 compute-0 nova_compute[187212]: 2025-11-25 19:47:59.161 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:47:59 compute-0 nova_compute[187212]: 2025-11-25 19:47:59.162 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:47:59 compute-0 podman[197585]: time="2025-11-25T19:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:47:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:47:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:48:00 compute-0 nova_compute[187212]: 2025-11-25 19:48:00.751 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:48:00 compute-0 nova_compute[187212]: 2025-11-25 19:48:00.751 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:48:00 compute-0 nova_compute[187212]: 2025-11-25 19:48:00.752 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:47:59 up  1:40,  0 user,  load average: 0.15, 0.08, 0.16\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:48:00 compute-0 nova_compute[187212]: 2025-11-25 19:48:00.815 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:48:01 compute-0 nova_compute[187212]: 2025-11-25 19:48:01.323 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: ERROR   19:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: ERROR   19:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: ERROR   19:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: ERROR   19:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: ERROR   19:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:48:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:48:01 compute-0 nova_compute[187212]: 2025-11-25 19:48:01.834 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:48:01 compute-0 nova_compute[187212]: 2025-11-25 19:48:01.834 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.673s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:48:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:48:02.278 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:48:02 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:48:02.278 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:48:02 compute-0 nova_compute[187212]: 2025-11-25 19:48:02.279 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:02 compute-0 podman[221786]: 2025-11-25 19:48:02.282560351 +0000 UTC m=+0.196632845 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4)
Nov 25 19:48:02 compute-0 nova_compute[187212]: 2025-11-25 19:48:02.296 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:03 compute-0 ovn_controller[95465]: 2025-11-25T19:48:03Z|00179|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 25 19:48:04 compute-0 nova_compute[187212]: 2025-11-25 19:48:04.831 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:05 compute-0 nova_compute[187212]: 2025-11-25 19:48:05.345 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:05 compute-0 nova_compute[187212]: 2025-11-25 19:48:05.346 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:06 compute-0 podman[221814]: 2025-11-25 19:48:06.139195416 +0000 UTC m=+0.068152321 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 25 19:48:06 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:48:06.280 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:48:07 compute-0 nova_compute[187212]: 2025-11-25 19:48:07.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:07 compute-0 nova_compute[187212]: 2025-11-25 19:48:07.298 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:11 compute-0 podman[221835]: 2025-11-25 19:48:11.189528386 +0000 UTC m=+0.101704028 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, version=9.6)
Nov 25 19:48:12 compute-0 nova_compute[187212]: 2025-11-25 19:48:12.301 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:12 compute-0 nova_compute[187212]: 2025-11-25 19:48:12.302 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:12 compute-0 nova_compute[187212]: 2025-11-25 19:48:12.303 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:48:12 compute-0 nova_compute[187212]: 2025-11-25 19:48:12.303 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:12 compute-0 nova_compute[187212]: 2025-11-25 19:48:12.352 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:12 compute-0 nova_compute[187212]: 2025-11-25 19:48:12.353 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:14 compute-0 podman[221856]: 2025-11-25 19:48:14.174974457 +0000 UTC m=+0.095791551 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=watcher_latest)
Nov 25 19:48:17 compute-0 nova_compute[187212]: 2025-11-25 19:48:17.354 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:17 compute-0 nova_compute[187212]: 2025-11-25 19:48:17.355 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:18 compute-0 sshd-session[221877]: Connection closed by authenticating user daemon 209.38.103.174 port 49178 [preauth]
Nov 25 19:48:22 compute-0 nova_compute[187212]: 2025-11-25 19:48:22.356 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:22 compute-0 nova_compute[187212]: 2025-11-25 19:48:22.359 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:25 compute-0 podman[221879]: 2025-11-25 19:48:25.143079377 +0000 UTC m=+0.070692558 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:48:27 compute-0 nova_compute[187212]: 2025-11-25 19:48:27.360 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:27 compute-0 nova_compute[187212]: 2025-11-25 19:48:27.361 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:27 compute-0 nova_compute[187212]: 2025-11-25 19:48:27.361 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:48:27 compute-0 nova_compute[187212]: 2025-11-25 19:48:27.361 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:27 compute-0 nova_compute[187212]: 2025-11-25 19:48:27.363 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:27 compute-0 nova_compute[187212]: 2025-11-25 19:48:27.364 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:29 compute-0 podman[197585]: time="2025-11-25T19:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:48:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:48:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3092 "" "Go-http-client/1.1"
Nov 25 19:48:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:48:31.168 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:48:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:48:31.169 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:48:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:48:31.170 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: ERROR   19:48:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: ERROR   19:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: ERROR   19:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: ERROR   19:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: ERROR   19:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:48:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:48:32 compute-0 nova_compute[187212]: 2025-11-25 19:48:32.364 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:32 compute-0 nova_compute[187212]: 2025-11-25 19:48:32.366 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:32 compute-0 nova_compute[187212]: 2025-11-25 19:48:32.367 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:48:32 compute-0 nova_compute[187212]: 2025-11-25 19:48:32.367 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:32 compute-0 nova_compute[187212]: 2025-11-25 19:48:32.416 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:32 compute-0 nova_compute[187212]: 2025-11-25 19:48:32.417 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:33 compute-0 podman[221905]: 2025-11-25 19:48:33.243957987 +0000 UTC m=+0.153968538 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4)
Nov 25 19:48:37 compute-0 podman[221933]: 2025-11-25 19:48:37.126149449 +0000 UTC m=+0.055006594 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Nov 25 19:48:37 compute-0 nova_compute[187212]: 2025-11-25 19:48:37.417 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:42 compute-0 podman[221953]: 2025-11-25 19:48:42.182647898 +0000 UTC m=+0.102670102 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Nov 25 19:48:42 compute-0 nova_compute[187212]: 2025-11-25 19:48:42.420 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:42 compute-0 nova_compute[187212]: 2025-11-25 19:48:42.422 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:42 compute-0 nova_compute[187212]: 2025-11-25 19:48:42.422 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:48:42 compute-0 nova_compute[187212]: 2025-11-25 19:48:42.422 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:42 compute-0 nova_compute[187212]: 2025-11-25 19:48:42.445 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:42 compute-0 nova_compute[187212]: 2025-11-25 19:48:42.446 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:45 compute-0 podman[221976]: 2025-11-25 19:48:45.15821338 +0000 UTC m=+0.074876369 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd)
Nov 25 19:48:47 compute-0 nova_compute[187212]: 2025-11-25 19:48:47.446 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:48 compute-0 nova_compute[187212]: 2025-11-25 19:48:48.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:52 compute-0 nova_compute[187212]: 2025-11-25 19:48:52.448 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:52 compute-0 nova_compute[187212]: 2025-11-25 19:48:52.449 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:52 compute-0 nova_compute[187212]: 2025-11-25 19:48:52.449 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:48:52 compute-0 nova_compute[187212]: 2025-11-25 19:48:52.450 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:52 compute-0 nova_compute[187212]: 2025-11-25 19:48:52.450 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:48:52 compute-0 nova_compute[187212]: 2025-11-25 19:48:52.451 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:48:54 compute-0 nova_compute[187212]: 2025-11-25 19:48:54.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:54 compute-0 nova_compute[187212]: 2025-11-25 19:48:54.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:54 compute-0 nova_compute[187212]: 2025-11-25 19:48:54.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:48:55 compute-0 nova_compute[187212]: 2025-11-25 19:48:55.171 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:56 compute-0 podman[221996]: 2025-11-25 19:48:56.141982493 +0000 UTC m=+0.065276476 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:48:57 compute-0 nova_compute[187212]: 2025-11-25 19:48:57.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:48:57 compute-0 nova_compute[187212]: 2025-11-25 19:48:57.452 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:48:57 compute-0 sshd-session[222020]: Connection closed by authenticating user daemon 209.38.103.174 port 42426 [preauth]
Nov 25 19:48:57 compute-0 nova_compute[187212]: 2025-11-25 19:48:57.728 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:48:57 compute-0 nova_compute[187212]: 2025-11-25 19:48:57.729 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:48:57 compute-0 nova_compute[187212]: 2025-11-25 19:48:57.729 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:48:57 compute-0 nova_compute[187212]: 2025-11-25 19:48:57.730 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:48:58 compute-0 nova_compute[187212]: 2025-11-25 19:48:58.780 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:48:58 compute-0 nova_compute[187212]: 2025-11-25 19:48:58.873 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:48:58 compute-0 nova_compute[187212]: 2025-11-25 19:48:58.875 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:48:58 compute-0 nova_compute[187212]: 2025-11-25 19:48:58.941 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:48:59 compute-0 nova_compute[187212]: 2025-11-25 19:48:59.182 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:48:59 compute-0 nova_compute[187212]: 2025-11-25 19:48:59.184 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:48:59 compute-0 nova_compute[187212]: 2025-11-25 19:48:59.220 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:48:59 compute-0 nova_compute[187212]: 2025-11-25 19:48:59.221 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5606MB free_disk=72.96123886108398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:48:59 compute-0 nova_compute[187212]: 2025-11-25 19:48:59.221 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:48:59 compute-0 nova_compute[187212]: 2025-11-25 19:48:59.222 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:48:59 compute-0 podman[197585]: time="2025-11-25T19:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:48:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:48:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3087 "" "Go-http-client/1.1"
Nov 25 19:49:00 compute-0 nova_compute[187212]: 2025-11-25 19:49:00.795 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:49:00 compute-0 nova_compute[187212]: 2025-11-25 19:49:00.796 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:49:00 compute-0 nova_compute[187212]: 2025-11-25 19:49:00.796 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:48:59 up  1:41,  0 user,  load average: 0.16, 0.10, 0.16\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:49:00 compute-0 nova_compute[187212]: 2025-11-25 19:49:00.845 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:49:01 compute-0 nova_compute[187212]: 2025-11-25 19:49:01.352 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: ERROR   19:49:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: ERROR   19:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: ERROR   19:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: ERROR   19:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: ERROR   19:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:49:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:49:01 compute-0 nova_compute[187212]: 2025-11-25 19:49:01.865 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:49:01 compute-0 nova_compute[187212]: 2025-11-25 19:49:01.866 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.644s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:49:02 compute-0 nova_compute[187212]: 2025-11-25 19:49:02.454 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:02 compute-0 nova_compute[187212]: 2025-11-25 19:49:02.456 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:02 compute-0 nova_compute[187212]: 2025-11-25 19:49:02.457 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:49:02 compute-0 nova_compute[187212]: 2025-11-25 19:49:02.457 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:02 compute-0 nova_compute[187212]: 2025-11-25 19:49:02.503 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:02 compute-0 nova_compute[187212]: 2025-11-25 19:49:02.505 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:04 compute-0 podman[222029]: 2025-11-25 19:49:04.230171856 +0000 UTC m=+0.148387131 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.506 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.508 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.508 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.508 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.553 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.554 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.866 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.867 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:07 compute-0 nova_compute[187212]: 2025-11-25 19:49:07.867 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:08 compute-0 podman[222057]: 2025-11-25 19:49:08.170786267 +0000 UTC m=+0.093319006 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:49:12 compute-0 nova_compute[187212]: 2025-11-25 19:49:12.602 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:12 compute-0 nova_compute[187212]: 2025-11-25 19:49:12.604 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:12 compute-0 nova_compute[187212]: 2025-11-25 19:49:12.604 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5050 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:49:12 compute-0 nova_compute[187212]: 2025-11-25 19:49:12.605 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:12 compute-0 nova_compute[187212]: 2025-11-25 19:49:12.605 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:12 compute-0 nova_compute[187212]: 2025-11-25 19:49:12.607 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:13 compute-0 podman[222077]: 2025-11-25 19:49:13.175082321 +0000 UTC m=+0.095112194 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:49:16 compute-0 podman[222099]: 2025-11-25 19:49:16.155739076 +0000 UTC m=+0.083552608 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:49:17 compute-0 nova_compute[187212]: 2025-11-25 19:49:17.607 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:22 compute-0 nova_compute[187212]: 2025-11-25 19:49:22.609 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:27 compute-0 podman[222119]: 2025-11-25 19:49:27.157227689 +0000 UTC m=+0.074174291 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:49:27 compute-0 nova_compute[187212]: 2025-11-25 19:49:27.611 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:29 compute-0 podman[197585]: time="2025-11-25T19:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:49:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:49:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3090 "" "Go-http-client/1.1"
Nov 25 19:49:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:49:31.172 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:49:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:49:31.172 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:49:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:49:31.173 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: ERROR   19:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: ERROR   19:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: ERROR   19:49:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: ERROR   19:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: ERROR   19:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:49:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:49:32 compute-0 nova_compute[187212]: 2025-11-25 19:49:32.613 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:32 compute-0 nova_compute[187212]: 2025-11-25 19:49:32.615 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:32 compute-0 nova_compute[187212]: 2025-11-25 19:49:32.616 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:49:32 compute-0 nova_compute[187212]: 2025-11-25 19:49:32.616 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:32 compute-0 nova_compute[187212]: 2025-11-25 19:49:32.617 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:49:32 compute-0 nova_compute[187212]: 2025-11-25 19:49:32.618 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:35 compute-0 podman[222145]: 2025-11-25 19:49:35.20935802 +0000 UTC m=+0.134111723 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Nov 25 19:49:37 compute-0 nova_compute[187212]: 2025-11-25 19:49:37.619 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:37 compute-0 sshd-session[222171]: Connection closed by authenticating user daemon 209.38.103.174 port 58848 [preauth]
Nov 25 19:49:39 compute-0 podman[222173]: 2025-11-25 19:49:39.173002523 +0000 UTC m=+0.090279016 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 25 19:49:42 compute-0 nova_compute[187212]: 2025-11-25 19:49:42.621 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:44 compute-0 podman[222194]: 2025-11-25 19:49:44.167068503 +0000 UTC m=+0.087999416 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 25 19:49:47 compute-0 podman[222215]: 2025-11-25 19:49:47.179368105 +0000 UTC m=+0.096571802 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:49:47 compute-0 nova_compute[187212]: 2025-11-25 19:49:47.623 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:50 compute-0 nova_compute[187212]: 2025-11-25 19:49:50.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:52 compute-0 nova_compute[187212]: 2025-11-25 19:49:52.625 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:49:55 compute-0 nova_compute[187212]: 2025-11-25 19:49:55.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:55 compute-0 nova_compute[187212]: 2025-11-25 19:49:55.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:55 compute-0 nova_compute[187212]: 2025-11-25 19:49:55.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:49:56 compute-0 nova_compute[187212]: 2025-11-25 19:49:56.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:57 compute-0 nova_compute[187212]: 2025-11-25 19:49:57.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:49:57 compute-0 nova_compute[187212]: 2025-11-25 19:49:57.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:49:57 compute-0 nova_compute[187212]: 2025-11-25 19:49:57.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:49:57 compute-0 nova_compute[187212]: 2025-11-25 19:49:57.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:49:57 compute-0 nova_compute[187212]: 2025-11-25 19:49:57.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:49:57 compute-0 nova_compute[187212]: 2025-11-25 19:49:57.689 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:49:58 compute-0 podman[222236]: 2025-11-25 19:49:58.156884494 +0000 UTC m=+0.082236393 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:49:58 compute-0 nova_compute[187212]: 2025-11-25 19:49:58.732 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:49:58 compute-0 nova_compute[187212]: 2025-11-25 19:49:58.806 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:49:58 compute-0 nova_compute[187212]: 2025-11-25 19:49:58.807 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:49:58 compute-0 nova_compute[187212]: 2025-11-25 19:49:58.876 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:49:59 compute-0 nova_compute[187212]: 2025-11-25 19:49:59.110 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:49:59 compute-0 nova_compute[187212]: 2025-11-25 19:49:59.112 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:49:59 compute-0 nova_compute[187212]: 2025-11-25 19:49:59.140 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:49:59 compute-0 nova_compute[187212]: 2025-11-25 19:49:59.141 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5598MB free_disk=72.96125793457031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:49:59 compute-0 nova_compute[187212]: 2025-11-25 19:49:59.141 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:49:59 compute-0 nova_compute[187212]: 2025-11-25 19:49:59.141 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:49:59 compute-0 podman[197585]: time="2025-11-25T19:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:49:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:49:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3084 "" "Go-http-client/1.1"
Nov 25 19:50:00 compute-0 nova_compute[187212]: 2025-11-25 19:50:00.729 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:50:00 compute-0 nova_compute[187212]: 2025-11-25 19:50:00.731 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:50:00 compute-0 nova_compute[187212]: 2025-11-25 19:50:00.731 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:49:59 up  1:42,  0 user,  load average: 0.06, 0.08, 0.15\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:50:00 compute-0 nova_compute[187212]: 2025-11-25 19:50:00.771 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:50:01 compute-0 nova_compute[187212]: 2025-11-25 19:50:01.291 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: ERROR   19:50:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: ERROR   19:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: ERROR   19:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: ERROR   19:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: ERROR   19:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:50:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:50:01 compute-0 nova_compute[187212]: 2025-11-25 19:50:01.870 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:50:01 compute-0 nova_compute[187212]: 2025-11-25 19:50:01.871 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.729s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:50:02 compute-0 nova_compute[187212]: 2025-11-25 19:50:02.629 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:02 compute-0 nova_compute[187212]: 2025-11-25 19:50:02.671 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:02 compute-0 nova_compute[187212]: 2025-11-25 19:50:02.671 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:50:02 compute-0 nova_compute[187212]: 2025-11-25 19:50:02.672 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:02 compute-0 nova_compute[187212]: 2025-11-25 19:50:02.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:02 compute-0 nova_compute[187212]: 2025-11-25 19:50:02.676 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:06 compute-0 podman[222268]: 2025-11-25 19:50:06.218228296 +0000 UTC m=+0.131020210 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.685 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.686 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.686 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.686 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:07 compute-0 nova_compute[187212]: 2025-11-25 19:50:07.686 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:50:09 compute-0 nova_compute[187212]: 2025-11-25 19:50:09.679 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:09 compute-0 nova_compute[187212]: 2025-11-25 19:50:09.679 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:50:10 compute-0 podman[222297]: 2025-11-25 19:50:10.157347481 +0000 UTC m=+0.076727447 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 19:50:10 compute-0 nova_compute[187212]: 2025-11-25 19:50:10.191 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:50:10 compute-0 nova_compute[187212]: 2025-11-25 19:50:10.417 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:10 compute-0 nova_compute[187212]: 2025-11-25 19:50:10.930 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Triggering sync for uuid f71d9429-2da3-4b6b-b82d-63027e46f952 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Nov 25 19:50:10 compute-0 nova_compute[187212]: 2025-11-25 19:50:10.931 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:50:10 compute-0 nova_compute[187212]: 2025-11-25 19:50:10.931 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:50:11 compute-0 nova_compute[187212]: 2025-11-25 19:50:11.445 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.514s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:50:12 compute-0 nova_compute[187212]: 2025-11-25 19:50:12.679 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:12 compute-0 nova_compute[187212]: 2025-11-25 19:50:12.680 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:12 compute-0 nova_compute[187212]: 2025-11-25 19:50:12.681 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:50:12 compute-0 nova_compute[187212]: 2025-11-25 19:50:12.681 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:12 compute-0 nova_compute[187212]: 2025-11-25 19:50:12.682 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:12 compute-0 nova_compute[187212]: 2025-11-25 19:50:12.683 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:15 compute-0 podman[222316]: 2025-11-25 19:50:15.171684706 +0000 UTC m=+0.084893853 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:17 compute-0 sshd-session[222338]: Connection closed by authenticating user daemon 209.38.103.174 port 44728 [preauth]
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.684 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.685 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.685 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.686 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.726 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:17 compute-0 nova_compute[187212]: 2025-11-25 19:50:17.727 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:18 compute-0 podman[222340]: 2025-11-25 19:50:18.166465327 +0000 UTC m=+0.086900976 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 25 19:50:22 compute-0 nova_compute[187212]: 2025-11-25 19:50:22.728 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:27 compute-0 nova_compute[187212]: 2025-11-25 19:50:27.730 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:29 compute-0 podman[222361]: 2025-11-25 19:50:29.169019046 +0000 UTC m=+0.083426134 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:50:29 compute-0 podman[197585]: time="2025-11-25T19:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:50:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:50:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3088 "" "Go-http-client/1.1"
Nov 25 19:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:50:31.174 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:50:31.174 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:50:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:50:31.175 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: ERROR   19:50:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: ERROR   19:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: ERROR   19:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: ERROR   19:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: ERROR   19:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:50:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:50:32 compute-0 nova_compute[187212]: 2025-11-25 19:50:32.733 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:32 compute-0 nova_compute[187212]: 2025-11-25 19:50:32.734 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:32 compute-0 nova_compute[187212]: 2025-11-25 19:50:32.734 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:50:32 compute-0 nova_compute[187212]: 2025-11-25 19:50:32.735 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:32 compute-0 nova_compute[187212]: 2025-11-25 19:50:32.775 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:32 compute-0 nova_compute[187212]: 2025-11-25 19:50:32.775 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:37 compute-0 podman[222388]: 2025-11-25 19:50:37.197356387 +0000 UTC m=+0.124047867 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Nov 25 19:50:37 compute-0 nova_compute[187212]: 2025-11-25 19:50:37.776 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:37 compute-0 nova_compute[187212]: 2025-11-25 19:50:37.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:41 compute-0 podman[222416]: 2025-11-25 19:50:41.139611695 +0000 UTC m=+0.065852280 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 25 19:50:42 compute-0 nova_compute[187212]: 2025-11-25 19:50:42.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:46 compute-0 podman[222435]: 2025-11-25 19:50:46.164860718 +0000 UTC m=+0.097952008 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:50:47 compute-0 nova_compute[187212]: 2025-11-25 19:50:47.781 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:49 compute-0 podman[222457]: 2025-11-25 19:50:49.185085068 +0000 UTC m=+0.097356571 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd)
Nov 25 19:50:52 compute-0 nova_compute[187212]: 2025-11-25 19:50:52.682 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:52 compute-0 nova_compute[187212]: 2025-11-25 19:50:52.783 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:55 compute-0 nova_compute[187212]: 2025-11-25 19:50:55.169 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.691 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.784 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.786 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.786 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.786 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.787 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:50:57 compute-0 nova_compute[187212]: 2025-11-25 19:50:57.789 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:50:58 compute-0 sshd-session[222478]: Connection closed by authenticating user daemon 209.38.103.174 port 42638 [preauth]
Nov 25 19:50:58 compute-0 nova_compute[187212]: 2025-11-25 19:50:58.741 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:50:58 compute-0 nova_compute[187212]: 2025-11-25 19:50:58.838 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:50:58 compute-0 nova_compute[187212]: 2025-11-25 19:50:58.839 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:50:58 compute-0 nova_compute[187212]: 2025-11-25 19:50:58.904 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:50:59 compute-0 nova_compute[187212]: 2025-11-25 19:50:59.074 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:50:59 compute-0 nova_compute[187212]: 2025-11-25 19:50:59.075 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:50:59 compute-0 nova_compute[187212]: 2025-11-25 19:50:59.098 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:50:59 compute-0 nova_compute[187212]: 2025-11-25 19:50:59.099 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5604MB free_disk=72.96125793457031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:50:59 compute-0 nova_compute[187212]: 2025-11-25 19:50:59.100 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:50:59 compute-0 nova_compute[187212]: 2025-11-25 19:50:59.100 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:50:59 compute-0 podman[197585]: time="2025-11-25T19:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:50:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:50:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:51:00 compute-0 podman[222487]: 2025-11-25 19:51:00.15776491 +0000 UTC m=+0.084095922 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.726 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.727 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.727 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:50:59 up  1:43,  0 user,  load average: 0.02, 0.06, 0.14\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.763 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.816 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.816 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.830 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.852 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:51:00 compute-0 nova_compute[187212]: 2025-11-25 19:51:00.891 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:51:01 compute-0 nova_compute[187212]: 2025-11-25 19:51:01.399 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: ERROR   19:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: ERROR   19:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: ERROR   19:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: ERROR   19:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: ERROR   19:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:51:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:51:01 compute-0 nova_compute[187212]: 2025-11-25 19:51:01.911 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:51:01 compute-0 nova_compute[187212]: 2025-11-25 19:51:01.911 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.811s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.790 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.792 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.792 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.793 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.829 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.831 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:02 compute-0 nova_compute[187212]: 2025-11-25 19:51:02.912 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:05 compute-0 nova_compute[187212]: 2025-11-25 19:51:05.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:06 compute-0 nova_compute[187212]: 2025-11-25 19:51:06.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:07 compute-0 nova_compute[187212]: 2025-11-25 19:51:07.831 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:08 compute-0 nova_compute[187212]: 2025-11-25 19:51:08.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:08 compute-0 podman[222512]: 2025-11-25 19:51:08.22920824 +0000 UTC m=+0.141124757 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:51:12 compute-0 podman[222538]: 2025-11-25 19:51:12.151667983 +0000 UTC m=+0.069387383 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:51:12 compute-0 nova_compute[187212]: 2025-11-25 19:51:12.833 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:17 compute-0 podman[222557]: 2025-11-25 19:51:17.134582395 +0000 UTC m=+0.058470315 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 25 19:51:17 compute-0 nova_compute[187212]: 2025-11-25 19:51:17.836 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:17 compute-0 nova_compute[187212]: 2025-11-25 19:51:17.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:17 compute-0 nova_compute[187212]: 2025-11-25 19:51:17.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:51:17 compute-0 nova_compute[187212]: 2025-11-25 19:51:17.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:17 compute-0 nova_compute[187212]: 2025-11-25 19:51:17.838 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:17 compute-0 nova_compute[187212]: 2025-11-25 19:51:17.839 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:20 compute-0 podman[222579]: 2025-11-25 19:51:20.143776026 +0000 UTC m=+0.073895022 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:51:22 compute-0 nova_compute[187212]: 2025-11-25 19:51:22.838 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:22 compute-0 nova_compute[187212]: 2025-11-25 19:51:22.839 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:27 compute-0 nova_compute[187212]: 2025-11-25 19:51:27.840 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:29 compute-0 podman[197585]: time="2025-11-25T19:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:51:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:51:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:51:31 compute-0 podman[222600]: 2025-11-25 19:51:31.149017769 +0000 UTC m=+0.068719425 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:51:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:51:31.176 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:51:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:51:31.177 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:51:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:51:31.177 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: ERROR   19:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: ERROR   19:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: ERROR   19:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: ERROR   19:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: ERROR   19:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:51:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:51:32 compute-0 nova_compute[187212]: 2025-11-25 19:51:32.843 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:37 compute-0 nova_compute[187212]: 2025-11-25 19:51:37.845 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:37 compute-0 nova_compute[187212]: 2025-11-25 19:51:37.847 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:37 compute-0 nova_compute[187212]: 2025-11-25 19:51:37.847 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:51:37 compute-0 nova_compute[187212]: 2025-11-25 19:51:37.848 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:37 compute-0 nova_compute[187212]: 2025-11-25 19:51:37.883 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:37 compute-0 nova_compute[187212]: 2025-11-25 19:51:37.883 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:39 compute-0 podman[222625]: 2025-11-25 19:51:39.242613415 +0000 UTC m=+0.162100131 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:51:40 compute-0 sshd-session[222654]: Connection closed by authenticating user daemon 209.38.103.174 port 48734 [preauth]
Nov 25 19:51:42 compute-0 nova_compute[187212]: 2025-11-25 19:51:42.885 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:42 compute-0 nova_compute[187212]: 2025-11-25 19:51:42.886 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:42 compute-0 nova_compute[187212]: 2025-11-25 19:51:42.887 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:51:42 compute-0 nova_compute[187212]: 2025-11-25 19:51:42.887 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:42 compute-0 nova_compute[187212]: 2025-11-25 19:51:42.926 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:42 compute-0 nova_compute[187212]: 2025-11-25 19:51:42.926 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:43 compute-0 podman[222656]: 2025-11-25 19:51:43.136686299 +0000 UTC m=+0.061372701 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:51:47 compute-0 nova_compute[187212]: 2025-11-25 19:51:47.927 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:48 compute-0 podman[222677]: 2025-11-25 19:51:48.16643483 +0000 UTC m=+0.089712370 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 19:51:51 compute-0 podman[222698]: 2025-11-25 19:51:51.174357388 +0000 UTC m=+0.094785344 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd)
Nov 25 19:51:52 compute-0 nova_compute[187212]: 2025-11-25 19:51:52.930 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:53 compute-0 nova_compute[187212]: 2025-11-25 19:51:53.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:56 compute-0 nova_compute[187212]: 2025-11-25 19:51:56.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.689 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.932 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.934 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.935 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.935 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.965 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:51:57 compute-0 nova_compute[187212]: 2025-11-25 19:51:57.966 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:51:58 compute-0 nova_compute[187212]: 2025-11-25 19:51:58.736 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:51:58 compute-0 nova_compute[187212]: 2025-11-25 19:51:58.826 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:51:58 compute-0 nova_compute[187212]: 2025-11-25 19:51:58.827 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:51:58 compute-0 nova_compute[187212]: 2025-11-25 19:51:58.893 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:51:59 compute-0 nova_compute[187212]: 2025-11-25 19:51:59.041 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:51:59 compute-0 nova_compute[187212]: 2025-11-25 19:51:59.042 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:51:59 compute-0 nova_compute[187212]: 2025-11-25 19:51:59.062 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:51:59 compute-0 nova_compute[187212]: 2025-11-25 19:51:59.063 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5599MB free_disk=72.96123886108398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:51:59 compute-0 nova_compute[187212]: 2025-11-25 19:51:59.063 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:51:59 compute-0 nova_compute[187212]: 2025-11-25 19:51:59.063 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:51:59 compute-0 podman[197585]: time="2025-11-25T19:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:51:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:51:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3089 "" "Go-http-client/1.1"
Nov 25 19:52:00 compute-0 nova_compute[187212]: 2025-11-25 19:52:00.640 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:52:00 compute-0 nova_compute[187212]: 2025-11-25 19:52:00.641 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:52:00 compute-0 nova_compute[187212]: 2025-11-25 19:52:00.641 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:51:59 up  1:44,  0 user,  load average: 0.12, 0.08, 0.14\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:52:00 compute-0 nova_compute[187212]: 2025-11-25 19:52:00.678 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:52:01 compute-0 nova_compute[187212]: 2025-11-25 19:52:01.186 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: ERROR   19:52:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: ERROR   19:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: ERROR   19:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: ERROR   19:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: ERROR   19:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:52:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:52:01 compute-0 nova_compute[187212]: 2025-11-25 19:52:01.699 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:52:01 compute-0 nova_compute[187212]: 2025-11-25 19:52:01.701 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.638s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:52:02 compute-0 podman[222728]: 2025-11-25 19:52:02.170956861 +0000 UTC m=+0.079272074 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:52:02 compute-0 nova_compute[187212]: 2025-11-25 19:52:02.703 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:02 compute-0 nova_compute[187212]: 2025-11-25 19:52:02.703 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:02 compute-0 nova_compute[187212]: 2025-11-25 19:52:02.703 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:52:02 compute-0 nova_compute[187212]: 2025-11-25 19:52:02.967 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:04 compute-0 nova_compute[187212]: 2025-11-25 19:52:04.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:06 compute-0 nova_compute[187212]: 2025-11-25 19:52:06.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:07 compute-0 nova_compute[187212]: 2025-11-25 19:52:07.970 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:08 compute-0 nova_compute[187212]: 2025-11-25 19:52:08.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:09 compute-0 nova_compute[187212]: 2025-11-25 19:52:09.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:10 compute-0 podman[222752]: 2025-11-25 19:52:10.230826524 +0000 UTC m=+0.135104198 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:52:12 compute-0 nova_compute[187212]: 2025-11-25 19:52:12.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:14 compute-0 podman[222778]: 2025-11-25 19:52:14.128513245 +0000 UTC m=+0.057531590 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:52:17 compute-0 nova_compute[187212]: 2025-11-25 19:52:17.977 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:17 compute-0 nova_compute[187212]: 2025-11-25 19:52:17.979 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:17 compute-0 nova_compute[187212]: 2025-11-25 19:52:17.979 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:52:17 compute-0 nova_compute[187212]: 2025-11-25 19:52:17.980 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:17 compute-0 nova_compute[187212]: 2025-11-25 19:52:17.996 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:17 compute-0 nova_compute[187212]: 2025-11-25 19:52:17.996 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:19 compute-0 podman[222796]: 2025-11-25 19:52:19.165841757 +0000 UTC m=+0.085838207 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 25 19:52:22 compute-0 podman[222819]: 2025-11-25 19:52:22.146748632 +0000 UTC m=+0.074719933 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 19:52:22 compute-0 sshd-session[222817]: Connection closed by authenticating user daemon 209.38.103.174 port 44524 [preauth]
Nov 25 19:52:22 compute-0 nova_compute[187212]: 2025-11-25 19:52:22.997 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:28 compute-0 nova_compute[187212]: 2025-11-25 19:52:28.000 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:28 compute-0 nova_compute[187212]: 2025-11-25 19:52:28.002 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:28 compute-0 nova_compute[187212]: 2025-11-25 19:52:28.003 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:52:28 compute-0 nova_compute[187212]: 2025-11-25 19:52:28.003 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:28 compute-0 nova_compute[187212]: 2025-11-25 19:52:28.053 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:28 compute-0 nova_compute[187212]: 2025-11-25 19:52:28.054 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:29 compute-0 podman[197585]: time="2025-11-25T19:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:52:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:52:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3083 "" "Go-http-client/1.1"
Nov 25 19:52:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:52:31.178 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:52:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:52:31.179 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:52:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:52:31.179 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: ERROR   19:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: ERROR   19:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: ERROR   19:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: ERROR   19:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: ERROR   19:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:52:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:52:33 compute-0 nova_compute[187212]: 2025-11-25 19:52:33.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:33 compute-0 podman[222840]: 2025-11-25 19:52:33.162492821 +0000 UTC m=+0.078416821 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:52:38 compute-0 nova_compute[187212]: 2025-11-25 19:52:38.057 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:38 compute-0 nova_compute[187212]: 2025-11-25 19:52:38.059 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:38 compute-0 nova_compute[187212]: 2025-11-25 19:52:38.059 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:52:38 compute-0 nova_compute[187212]: 2025-11-25 19:52:38.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:38 compute-0 nova_compute[187212]: 2025-11-25 19:52:38.094 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:38 compute-0 nova_compute[187212]: 2025-11-25 19:52:38.095 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:41 compute-0 podman[222864]: 2025-11-25 19:52:41.208208533 +0000 UTC m=+0.133137357 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 19:52:43 compute-0 nova_compute[187212]: 2025-11-25 19:52:43.095 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:43 compute-0 nova_compute[187212]: 2025-11-25 19:52:43.096 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:45 compute-0 podman[222891]: 2025-11-25 19:52:45.150254165 +0000 UTC m=+0.075615207 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:52:48 compute-0 nova_compute[187212]: 2025-11-25 19:52:48.098 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:50 compute-0 podman[222911]: 2025-11-25 19:52:50.273712829 +0000 UTC m=+0.090744856 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 25 19:52:53 compute-0 nova_compute[187212]: 2025-11-25 19:52:53.136 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:53 compute-0 nova_compute[187212]: 2025-11-25 19:52:53.137 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:53 compute-0 nova_compute[187212]: 2025-11-25 19:52:53.137 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:52:53 compute-0 nova_compute[187212]: 2025-11-25 19:52:53.138 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:53 compute-0 nova_compute[187212]: 2025-11-25 19:52:53.138 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:52:53 compute-0 nova_compute[187212]: 2025-11-25 19:52:53.139 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:52:53 compute-0 podman[222933]: 2025-11-25 19:52:53.167757241 +0000 UTC m=+0.093847519 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:52:55 compute-0 nova_compute[187212]: 2025-11-25 19:52:55.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:57 compute-0 nova_compute[187212]: 2025-11-25 19:52:57.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.139 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:52:58 compute-0 nova_compute[187212]: 2025-11-25 19:52:58.691 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:52:59 compute-0 nova_compute[187212]: 2025-11-25 19:52:59.737 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:52:59 compute-0 podman[197585]: time="2025-11-25T19:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:52:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:52:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:52:59 compute-0 nova_compute[187212]: 2025-11-25 19:52:59.793 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:52:59 compute-0 nova_compute[187212]: 2025-11-25 19:52:59.794 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:52:59 compute-0 nova_compute[187212]: 2025-11-25 19:52:59.848 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:53:00 compute-0 nova_compute[187212]: 2025-11-25 19:53:00.012 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:53:00 compute-0 nova_compute[187212]: 2025-11-25 19:53:00.014 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:53:00 compute-0 nova_compute[187212]: 2025-11-25 19:53:00.053 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:53:00 compute-0 nova_compute[187212]: 2025-11-25 19:53:00.054 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5607MB free_disk=72.96125793457031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:53:00 compute-0 nova_compute[187212]: 2025-11-25 19:53:00.054 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:53:00 compute-0 nova_compute[187212]: 2025-11-25 19:53:00.055 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: ERROR   19:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: ERROR   19:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: ERROR   19:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: ERROR   19:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: ERROR   19:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:53:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:53:01 compute-0 nova_compute[187212]: 2025-11-25 19:53:01.623 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:53:01 compute-0 nova_compute[187212]: 2025-11-25 19:53:01.624 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:53:01 compute-0 nova_compute[187212]: 2025-11-25 19:53:01.625 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:53:00 up  1:45,  0 user,  load average: 0.04, 0.06, 0.13\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:53:01 compute-0 nova_compute[187212]: 2025-11-25 19:53:01.682 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:53:02 compute-0 nova_compute[187212]: 2025-11-25 19:53:02.191 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:53:02 compute-0 nova_compute[187212]: 2025-11-25 19:53:02.704 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:53:02 compute-0 nova_compute[187212]: 2025-11-25 19:53:02.704 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.649s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:53:03 compute-0 nova_compute[187212]: 2025-11-25 19:53:03.142 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:03 compute-0 sshd-session[222961]: Invalid user debian from 209.38.103.174 port 50824
Nov 25 19:53:03 compute-0 sshd-session[222961]: Connection closed by invalid user debian 209.38.103.174 port 50824 [preauth]
Nov 25 19:53:03 compute-0 podman[222963]: 2025-11-25 19:53:03.644884838 +0000 UTC m=+0.091408655 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 19:53:04 compute-0 nova_compute[187212]: 2025-11-25 19:53:04.705 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:53:07 compute-0 nova_compute[187212]: 2025-11-25 19:53:07.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:53:08 compute-0 nova_compute[187212]: 2025-11-25 19:53:08.144 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:08 compute-0 nova_compute[187212]: 2025-11-25 19:53:08.146 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:08 compute-0 nova_compute[187212]: 2025-11-25 19:53:08.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:53:09 compute-0 sshd-session[222989]: Received disconnect from 150.95.85.24 port 39676:11:  [preauth]
Nov 25 19:53:09 compute-0 sshd-session[222989]: Disconnected from authenticating user root 150.95.85.24 port 39676 [preauth]
Nov 25 19:53:09 compute-0 nova_compute[187212]: 2025-11-25 19:53:09.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:53:12 compute-0 podman[222991]: 2025-11-25 19:53:12.204160748 +0000 UTC m=+0.127409374 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Nov 25 19:53:13 compute-0 nova_compute[187212]: 2025-11-25 19:53:13.146 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:16 compute-0 podman[223018]: 2025-11-25 19:53:16.184876261 +0000 UTC m=+0.101612274 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 19:53:18 compute-0 nova_compute[187212]: 2025-11-25 19:53:18.148 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:18 compute-0 nova_compute[187212]: 2025-11-25 19:53:18.149 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:18 compute-0 nova_compute[187212]: 2025-11-25 19:53:18.150 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:53:18 compute-0 nova_compute[187212]: 2025-11-25 19:53:18.150 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:18 compute-0 nova_compute[187212]: 2025-11-25 19:53:18.150 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:21 compute-0 podman[223037]: 2025-11-25 19:53:21.147151769 +0000 UTC m=+0.068309634 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Nov 25 19:53:23 compute-0 nova_compute[187212]: 2025-11-25 19:53:23.151 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:24 compute-0 podman[223058]: 2025-11-25 19:53:24.173633468 +0000 UTC m=+0.093668245 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:53:28 compute-0 nova_compute[187212]: 2025-11-25 19:53:28.154 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:28 compute-0 nova_compute[187212]: 2025-11-25 19:53:28.154 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:28 compute-0 nova_compute[187212]: 2025-11-25 19:53:28.155 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:53:28 compute-0 nova_compute[187212]: 2025-11-25 19:53:28.155 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:28 compute-0 nova_compute[187212]: 2025-11-25 19:53:28.155 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:28 compute-0 nova_compute[187212]: 2025-11-25 19:53:28.155 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:29 compute-0 podman[197585]: time="2025-11-25T19:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:53:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:53:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 19:53:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:53:31.180 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:53:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:53:31.181 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:53:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:53:31.181 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: ERROR   19:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: ERROR   19:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: ERROR   19:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: ERROR   19:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: ERROR   19:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:53:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:53:33 compute-0 nova_compute[187212]: 2025-11-25 19:53:33.156 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:33 compute-0 nova_compute[187212]: 2025-11-25 19:53:33.158 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:33 compute-0 nova_compute[187212]: 2025-11-25 19:53:33.158 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:53:33 compute-0 nova_compute[187212]: 2025-11-25 19:53:33.158 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:33 compute-0 nova_compute[187212]: 2025-11-25 19:53:33.194 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:33 compute-0 nova_compute[187212]: 2025-11-25 19:53:33.195 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:34 compute-0 podman[223079]: 2025-11-25 19:53:34.156205917 +0000 UTC m=+0.071176620 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:53:38 compute-0 nova_compute[187212]: 2025-11-25 19:53:38.196 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:38 compute-0 nova_compute[187212]: 2025-11-25 19:53:38.198 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:38 compute-0 nova_compute[187212]: 2025-11-25 19:53:38.199 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:53:38 compute-0 nova_compute[187212]: 2025-11-25 19:53:38.199 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:38 compute-0 nova_compute[187212]: 2025-11-25 19:53:38.227 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:38 compute-0 nova_compute[187212]: 2025-11-25 19:53:38.228 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:43 compute-0 podman[223103]: 2025-11-25 19:53:43.22737631 +0000 UTC m=+0.145457051 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 19:53:43 compute-0 nova_compute[187212]: 2025-11-25 19:53:43.229 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:43 compute-0 nova_compute[187212]: 2025-11-25 19:53:43.231 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:43 compute-0 sshd-session[223116]: Invalid user debian from 209.38.103.174 port 56522
Nov 25 19:53:43 compute-0 sshd-session[223116]: Connection closed by invalid user debian 209.38.103.174 port 56522 [preauth]
Nov 25 19:53:47 compute-0 podman[223131]: 2025-11-25 19:53:47.135853243 +0000 UTC m=+0.052322132 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Nov 25 19:53:48 compute-0 nova_compute[187212]: 2025-11-25 19:53:48.231 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:48 compute-0 nova_compute[187212]: 2025-11-25 19:53:48.233 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:48 compute-0 nova_compute[187212]: 2025-11-25 19:53:48.233 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:53:48 compute-0 nova_compute[187212]: 2025-11-25 19:53:48.233 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:48 compute-0 nova_compute[187212]: 2025-11-25 19:53:48.268 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:48 compute-0 nova_compute[187212]: 2025-11-25 19:53:48.269 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:52 compute-0 podman[223152]: 2025-11-25 19:53:52.159529113 +0000 UTC m=+0.082413547 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter)
Nov 25 19:53:53 compute-0 nova_compute[187212]: 2025-11-25 19:53:53.269 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:53 compute-0 nova_compute[187212]: 2025-11-25 19:53:53.271 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:55 compute-0 podman[223175]: 2025-11-25 19:53:55.163884726 +0000 UTC m=+0.083959278 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:53:56 compute-0 nova_compute[187212]: 2025-11-25 19:53:56.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.272 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.273 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.273 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.273 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.274 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.275 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.696 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.696 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.697 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:53:58 compute-0 nova_compute[187212]: 2025-11-25 19:53:58.697 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:53:59 compute-0 podman[197585]: time="2025-11-25T19:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:53:59 compute-0 nova_compute[187212]: 2025-11-25 19:53:59.749 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:53:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:53:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 19:53:59 compute-0 nova_compute[187212]: 2025-11-25 19:53:59.837 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:53:59 compute-0 nova_compute[187212]: 2025-11-25 19:53:59.838 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:53:59 compute-0 nova_compute[187212]: 2025-11-25 19:53:59.927 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:54:00 compute-0 nova_compute[187212]: 2025-11-25 19:54:00.147 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:54:00 compute-0 nova_compute[187212]: 2025-11-25 19:54:00.149 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:54:00 compute-0 nova_compute[187212]: 2025-11-25 19:54:00.174 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:54:00 compute-0 nova_compute[187212]: 2025-11-25 19:54:00.175 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5595MB free_disk=72.96421432495117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:54:00 compute-0 nova_compute[187212]: 2025-11-25 19:54:00.175 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:54:00 compute-0 nova_compute[187212]: 2025-11-25 19:54:00.175 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: ERROR   19:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: ERROR   19:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: ERROR   19:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: ERROR   19:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: ERROR   19:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:54:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:54:01 compute-0 nova_compute[187212]: 2025-11-25 19:54:01.764 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:54:01 compute-0 nova_compute[187212]: 2025-11-25 19:54:01.765 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:54:01 compute-0 nova_compute[187212]: 2025-11-25 19:54:01.766 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:54:00 up  1:46,  0 user,  load average: 0.24, 0.12, 0.15\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:54:01 compute-0 nova_compute[187212]: 2025-11-25 19:54:01.835 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:54:02 compute-0 nova_compute[187212]: 2025-11-25 19:54:02.352 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:54:02 compute-0 nova_compute[187212]: 2025-11-25 19:54:02.864 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:54:02 compute-0 nova_compute[187212]: 2025-11-25 19:54:02.865 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.689s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.278 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.280 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.281 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.281 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.323 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.324 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.862 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.863 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.863 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:03 compute-0 nova_compute[187212]: 2025-11-25 19:54:03.864 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:54:05 compute-0 podman[223203]: 2025-11-25 19:54:05.169621457 +0000 UTC m=+0.085262212 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:54:05 compute-0 nova_compute[187212]: 2025-11-25 19:54:05.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.374 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.375 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.376 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5051 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.376 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.376 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:08 compute-0 nova_compute[187212]: 2025-11-25 19:54:08.377 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:09 compute-0 nova_compute[187212]: 2025-11-25 19:54:09.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:10 compute-0 nova_compute[187212]: 2025-11-25 19:54:10.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:13 compute-0 nova_compute[187212]: 2025-11-25 19:54:13.378 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:14 compute-0 podman[223228]: 2025-11-25 19:54:14.214907331 +0000 UTC m=+0.134071921 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 25 19:54:18 compute-0 podman[223254]: 2025-11-25 19:54:18.161586464 +0000 UTC m=+0.074245611 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Nov 25 19:54:18 compute-0 nova_compute[187212]: 2025-11-25 19:54:18.380 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:23 compute-0 sshd-session[223274]: Invalid user debian from 209.38.103.174 port 41936
Nov 25 19:54:23 compute-0 podman[223276]: 2025-11-25 19:54:23.160591793 +0000 UTC m=+0.079052928 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 25 19:54:23 compute-0 sshd-session[223274]: Connection closed by invalid user debian 209.38.103.174 port 41936 [preauth]
Nov 25 19:54:23 compute-0 nova_compute[187212]: 2025-11-25 19:54:23.382 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:23 compute-0 nova_compute[187212]: 2025-11-25 19:54:23.383 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:23 compute-0 nova_compute[187212]: 2025-11-25 19:54:23.384 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:54:23 compute-0 nova_compute[187212]: 2025-11-25 19:54:23.384 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:23 compute-0 nova_compute[187212]: 2025-11-25 19:54:23.384 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:23 compute-0 nova_compute[187212]: 2025-11-25 19:54:23.385 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:26 compute-0 podman[223297]: 2025-11-25 19:54:26.165249434 +0000 UTC m=+0.078065352 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Nov 25 19:54:28 compute-0 nova_compute[187212]: 2025-11-25 19:54:28.386 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:28 compute-0 nova_compute[187212]: 2025-11-25 19:54:28.388 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:28 compute-0 nova_compute[187212]: 2025-11-25 19:54:28.388 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:54:28 compute-0 nova_compute[187212]: 2025-11-25 19:54:28.388 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:28 compute-0 nova_compute[187212]: 2025-11-25 19:54:28.430 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:28 compute-0 nova_compute[187212]: 2025-11-25 19:54:28.431 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:29 compute-0 podman[197585]: time="2025-11-25T19:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:54:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:54:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3097 "" "Go-http-client/1.1"
Nov 25 19:54:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:54:31.182 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:54:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:54:31.184 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:54:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:54:31.184 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: ERROR   19:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: ERROR   19:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: ERROR   19:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: ERROR   19:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: ERROR   19:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:54:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:54:33 compute-0 nova_compute[187212]: 2025-11-25 19:54:33.431 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:33 compute-0 nova_compute[187212]: 2025-11-25 19:54:33.432 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:36 compute-0 podman[223319]: 2025-11-25 19:54:36.165020189 +0000 UTC m=+0.083142006 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:54:38 compute-0 nova_compute[187212]: 2025-11-25 19:54:38.432 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:38 compute-0 nova_compute[187212]: 2025-11-25 19:54:38.433 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:38 compute-0 nova_compute[187212]: 2025-11-25 19:54:38.433 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:54:38 compute-0 nova_compute[187212]: 2025-11-25 19:54:38.434 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:38 compute-0 nova_compute[187212]: 2025-11-25 19:54:38.434 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:38 compute-0 nova_compute[187212]: 2025-11-25 19:54:38.435 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:43 compute-0 nova_compute[187212]: 2025-11-25 19:54:43.436 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:43 compute-0 nova_compute[187212]: 2025-11-25 19:54:43.438 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:43 compute-0 nova_compute[187212]: 2025-11-25 19:54:43.438 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:54:43 compute-0 nova_compute[187212]: 2025-11-25 19:54:43.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:43 compute-0 nova_compute[187212]: 2025-11-25 19:54:43.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:54:43 compute-0 nova_compute[187212]: 2025-11-25 19:54:43.441 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:45 compute-0 podman[223345]: 2025-11-25 19:54:45.195353366 +0000 UTC m=+0.122499055 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 25 19:54:48 compute-0 nova_compute[187212]: 2025-11-25 19:54:48.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:49 compute-0 podman[223372]: 2025-11-25 19:54:49.166702953 +0000 UTC m=+0.081809451 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Nov 25 19:54:53 compute-0 nova_compute[187212]: 2025-11-25 19:54:53.441 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:54 compute-0 podman[223392]: 2025-11-25 19:54:54.170607 +0000 UTC m=+0.086346291 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm)
Nov 25 19:54:57 compute-0 podman[223413]: 2025-11-25 19:54:57.159848195 +0000 UTC m=+0.077048735 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Nov 25 19:54:57 compute-0 nova_compute[187212]: 2025-11-25 19:54:57.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:58 compute-0 nova_compute[187212]: 2025-11-25 19:54:58.443 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:54:58 compute-0 nova_compute[187212]: 2025-11-25 19:54:58.445 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:54:59 compute-0 nova_compute[187212]: 2025-11-25 19:54:59.691 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:54:59 compute-0 podman[197585]: time="2025-11-25T19:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:54:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:54:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 19:55:00 compute-0 nova_compute[187212]: 2025-11-25 19:55:00.743 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:55:00 compute-0 nova_compute[187212]: 2025-11-25 19:55:00.820 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:55:00 compute-0 nova_compute[187212]: 2025-11-25 19:55:00.821 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:55:00 compute-0 nova_compute[187212]: 2025-11-25 19:55:00.887 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:55:01 compute-0 nova_compute[187212]: 2025-11-25 19:55:01.069 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:55:01 compute-0 nova_compute[187212]: 2025-11-25 19:55:01.070 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:55:01 compute-0 nova_compute[187212]: 2025-11-25 19:55:01.085 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:55:01 compute-0 nova_compute[187212]: 2025-11-25 19:55:01.086 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5608MB free_disk=72.9642333984375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:55:01 compute-0 nova_compute[187212]: 2025-11-25 19:55:01.086 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:55:01 compute-0 nova_compute[187212]: 2025-11-25 19:55:01.087 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: ERROR   19:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: ERROR   19:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: ERROR   19:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: ERROR   19:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: ERROR   19:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:55:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:55:02 compute-0 nova_compute[187212]: 2025-11-25 19:55:02.667 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:55:02 compute-0 nova_compute[187212]: 2025-11-25 19:55:02.668 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:55:02 compute-0 nova_compute[187212]: 2025-11-25 19:55:02.668 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:55:01 up  1:47,  0 user,  load average: 0.14, 0.11, 0.14\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:55:02 compute-0 nova_compute[187212]: 2025-11-25 19:55:02.788 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.297 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.444 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.446 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.447 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.447 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.493 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.495 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.808 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:55:03 compute-0 nova_compute[187212]: 2025-11-25 19:55:03.808 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.722s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:55:04 compute-0 sshd-session[223442]: Invalid user debian from 209.38.103.174 port 46920
Nov 25 19:55:05 compute-0 sshd-session[223442]: Connection closed by invalid user debian 209.38.103.174 port 46920 [preauth]
Nov 25 19:55:05 compute-0 nova_compute[187212]: 2025-11-25 19:55:05.806 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:05 compute-0 nova_compute[187212]: 2025-11-25 19:55:05.807 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:07 compute-0 podman[223444]: 2025-11-25 19:55:07.19188227 +0000 UTC m=+0.101783908 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:55:08 compute-0 nova_compute[187212]: 2025-11-25 19:55:08.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:08 compute-0 nova_compute[187212]: 2025-11-25 19:55:08.495 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:08 compute-0 nova_compute[187212]: 2025-11-25 19:55:08.496 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:09 compute-0 nova_compute[187212]: 2025-11-25 19:55:09.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:09 compute-0 nova_compute[187212]: 2025-11-25 19:55:09.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 19:55:10 compute-0 nova_compute[187212]: 2025-11-25 19:55:10.684 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:11 compute-0 nova_compute[187212]: 2025-11-25 19:55:11.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:13 compute-0 nova_compute[187212]: 2025-11-25 19:55:13.497 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:13 compute-0 nova_compute[187212]: 2025-11-25 19:55:13.499 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:16 compute-0 podman[223468]: 2025-11-25 19:55:16.225879173 +0000 UTC m=+0.141981780 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4)
Nov 25 19:55:18 compute-0 nova_compute[187212]: 2025-11-25 19:55:18.502 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:18 compute-0 nova_compute[187212]: 2025-11-25 19:55:18.504 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:20 compute-0 podman[223497]: 2025-11-25 19:55:20.161935199 +0000 UTC m=+0.079026747 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 25 19:55:22 compute-0 nova_compute[187212]: 2025-11-25 19:55:22.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:22 compute-0 nova_compute[187212]: 2025-11-25 19:55:22.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 19:55:22 compute-0 nova_compute[187212]: 2025-11-25 19:55:22.681 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 19:55:22 compute-0 nova_compute[187212]: 2025-11-25 19:55:22.681 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:23 compute-0 nova_compute[187212]: 2025-11-25 19:55:23.505 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:25 compute-0 podman[223515]: 2025-11-25 19:55:25.175171934 +0000 UTC m=+0.098077050 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, version=9.6, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 19:55:28 compute-0 podman[223537]: 2025-11-25 19:55:28.169121474 +0000 UTC m=+0.086424223 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 25 19:55:28 compute-0 nova_compute[187212]: 2025-11-25 19:55:28.507 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:28 compute-0 nova_compute[187212]: 2025-11-25 19:55:28.508 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:28 compute-0 nova_compute[187212]: 2025-11-25 19:55:28.509 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:28 compute-0 nova_compute[187212]: 2025-11-25 19:55:28.509 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:28 compute-0 nova_compute[187212]: 2025-11-25 19:55:28.509 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:28 compute-0 nova_compute[187212]: 2025-11-25 19:55:28.510 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:29 compute-0 podman[197585]: time="2025-11-25T19:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:55:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:55:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 19:55:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:55:31.185 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:55:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:55:31.185 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:55:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:55:31.186 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: ERROR   19:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: ERROR   19:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: ERROR   19:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: ERROR   19:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: ERROR   19:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:55:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:55:33 compute-0 nova_compute[187212]: 2025-11-25 19:55:33.511 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:38 compute-0 podman[223558]: 2025-11-25 19:55:38.155162197 +0000 UTC m=+0.075458503 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:55:38 compute-0 nova_compute[187212]: 2025-11-25 19:55:38.512 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:38 compute-0 nova_compute[187212]: 2025-11-25 19:55:38.513 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:38 compute-0 nova_compute[187212]: 2025-11-25 19:55:38.513 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:38 compute-0 nova_compute[187212]: 2025-11-25 19:55:38.513 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:38 compute-0 nova_compute[187212]: 2025-11-25 19:55:38.513 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:38 compute-0 nova_compute[187212]: 2025-11-25 19:55:38.514 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:43 compute-0 nova_compute[187212]: 2025-11-25 19:55:43.515 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:43 compute-0 nova_compute[187212]: 2025-11-25 19:55:43.517 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:43 compute-0 nova_compute[187212]: 2025-11-25 19:55:43.517 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:43 compute-0 nova_compute[187212]: 2025-11-25 19:55:43.517 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:43 compute-0 nova_compute[187212]: 2025-11-25 19:55:43.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:43 compute-0 nova_compute[187212]: 2025-11-25 19:55:43.519 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:44 compute-0 sshd-session[223582]: Invalid user debian from 209.38.103.174 port 42780
Nov 25 19:55:44 compute-0 sshd-session[223582]: Connection closed by invalid user debian 209.38.103.174 port 42780 [preauth]
Nov 25 19:55:47 compute-0 podman[223585]: 2025-11-25 19:55:47.231102678 +0000 UTC m=+0.146441388 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 25 19:55:48 compute-0 nova_compute[187212]: 2025-11-25 19:55:48.520 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:48 compute-0 nova_compute[187212]: 2025-11-25 19:55:48.521 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:48 compute-0 nova_compute[187212]: 2025-11-25 19:55:48.521 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:48 compute-0 nova_compute[187212]: 2025-11-25 19:55:48.521 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:48 compute-0 nova_compute[187212]: 2025-11-25 19:55:48.522 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:48 compute-0 nova_compute[187212]: 2025-11-25 19:55:48.523 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:51 compute-0 podman[223612]: 2025-11-25 19:55:51.167903542 +0000 UTC m=+0.082861559 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 19:55:53 compute-0 nova_compute[187212]: 2025-11-25 19:55:53.525 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:53 compute-0 nova_compute[187212]: 2025-11-25 19:55:53.526 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:53 compute-0 nova_compute[187212]: 2025-11-25 19:55:53.527 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:53 compute-0 nova_compute[187212]: 2025-11-25 19:55:53.527 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:53 compute-0 nova_compute[187212]: 2025-11-25 19:55:53.561 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:53 compute-0 nova_compute[187212]: 2025-11-25 19:55:53.562 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:56 compute-0 podman[223631]: 2025-11-25 19:55:56.164716672 +0000 UTC m=+0.087397889 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Nov 25 19:55:58 compute-0 nova_compute[187212]: 2025-11-25 19:55:58.563 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:55:58 compute-0 nova_compute[187212]: 2025-11-25 19:55:58.565 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:58 compute-0 nova_compute[187212]: 2025-11-25 19:55:58.565 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:55:58 compute-0 nova_compute[187212]: 2025-11-25 19:55:58.565 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:58 compute-0 nova_compute[187212]: 2025-11-25 19:55:58.566 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:55:58 compute-0 nova_compute[187212]: 2025-11-25 19:55:58.567 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:55:59 compute-0 podman[223652]: 2025-11-25 19:55:59.167445211 +0000 UTC m=+0.088498587 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 19:55:59 compute-0 nova_compute[187212]: 2025-11-25 19:55:59.188 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:59 compute-0 nova_compute[187212]: 2025-11-25 19:55:59.188 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:55:59 compute-0 nova_compute[187212]: 2025-11-25 19:55:59.189 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:55:59 compute-0 podman[197585]: time="2025-11-25T19:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:55:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:55:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 19:56:01 compute-0 nova_compute[187212]: 2025-11-25 19:56:01.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:01 compute-0 nova_compute[187212]: 2025-11-25 19:56:01.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:01 compute-0 anacron[202533]: Job `cron.weekly' started
Nov 25 19:56:01 compute-0 anacron[202533]: Job `cron.weekly' terminated
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: ERROR   19:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: ERROR   19:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: ERROR   19:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: ERROR   19:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: ERROR   19:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:56:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:56:01 compute-0 nova_compute[187212]: 2025-11-25 19:56:01.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:56:01 compute-0 nova_compute[187212]: 2025-11-25 19:56:01.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:56:01 compute-0 nova_compute[187212]: 2025-11-25 19:56:01.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:56:01 compute-0 nova_compute[187212]: 2025-11-25 19:56:01.690 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:56:02 compute-0 nova_compute[187212]: 2025-11-25 19:56:02.740 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:56:02 compute-0 nova_compute[187212]: 2025-11-25 19:56:02.825 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:56:02 compute-0 nova_compute[187212]: 2025-11-25 19:56:02.826 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:56:02 compute-0 nova_compute[187212]: 2025-11-25 19:56:02.912 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.163 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.165 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.206 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.207 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5605MB free_disk=72.96416091918945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.208 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.208 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.568 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.570 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.570 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.570 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.610 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:03 compute-0 nova_compute[187212]: 2025-11-25 19:56:03.610 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:04 compute-0 nova_compute[187212]: 2025-11-25 19:56:04.821 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:56:04 compute-0 nova_compute[187212]: 2025-11-25 19:56:04.822 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:56:04 compute-0 nova_compute[187212]: 2025-11-25 19:56:04.823 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:56:03 up  1:48,  0 user,  load average: 0.05, 0.09, 0.13\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:56:04 compute-0 nova_compute[187212]: 2025-11-25 19:56:04.888 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 19:56:04 compute-0 nova_compute[187212]: 2025-11-25 19:56:04.911 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 19:56:04 compute-0 nova_compute[187212]: 2025-11-25 19:56:04.912 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 19:56:05 compute-0 nova_compute[187212]: 2025-11-25 19:56:05.083 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 19:56:05 compute-0 nova_compute[187212]: 2025-11-25 19:56:05.101 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 19:56:05 compute-0 nova_compute[187212]: 2025-11-25 19:56:05.148 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:56:05 compute-0 nova_compute[187212]: 2025-11-25 19:56:05.658 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:56:06 compute-0 nova_compute[187212]: 2025-11-25 19:56:06.171 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:56:06 compute-0 nova_compute[187212]: 2025-11-25 19:56:06.172 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.964s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:56:07 compute-0 nova_compute[187212]: 2025-11-25 19:56:07.168 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:08 compute-0 nova_compute[187212]: 2025-11-25 19:56:08.612 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:09 compute-0 podman[223684]: 2025-11-25 19:56:09.158780924 +0000 UTC m=+0.080648740 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:56:09 compute-0 nova_compute[187212]: 2025-11-25 19:56:09.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:10 compute-0 nova_compute[187212]: 2025-11-25 19:56:10.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:10 compute-0 nova_compute[187212]: 2025-11-25 19:56:10.683 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:13 compute-0 nova_compute[187212]: 2025-11-25 19:56:13.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:13 compute-0 nova_compute[187212]: 2025-11-25 19:56:13.613 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:18 compute-0 podman[223710]: 2025-11-25 19:56:18.212014246 +0000 UTC m=+0.126757238 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 25 19:56:18 compute-0 nova_compute[187212]: 2025-11-25 19:56:18.615 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:18 compute-0 nova_compute[187212]: 2025-11-25 19:56:18.617 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:18 compute-0 nova_compute[187212]: 2025-11-25 19:56:18.617 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:56:18 compute-0 nova_compute[187212]: 2025-11-25 19:56:18.617 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:18 compute-0 nova_compute[187212]: 2025-11-25 19:56:18.664 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:18 compute-0 nova_compute[187212]: 2025-11-25 19:56:18.664 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:22 compute-0 podman[223737]: 2025-11-25 19:56:22.14584151 +0000 UTC m=+0.063211450 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 19:56:23 compute-0 nova_compute[187212]: 2025-11-25 19:56:23.665 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:23 compute-0 nova_compute[187212]: 2025-11-25 19:56:23.666 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:24 compute-0 sshd-session[223757]: Invalid user debian from 209.38.103.174 port 50830
Nov 25 19:56:24 compute-0 sshd-session[223757]: Connection closed by invalid user debian 209.38.103.174 port 50830 [preauth]
Nov 25 19:56:27 compute-0 podman[223759]: 2025-11-25 19:56:27.149281246 +0000 UTC m=+0.068731266 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Nov 25 19:56:28 compute-0 nova_compute[187212]: 2025-11-25 19:56:28.668 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:29 compute-0 podman[197585]: time="2025-11-25T19:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:56:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:56:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 19:56:30 compute-0 podman[223781]: 2025-11-25 19:56:30.158054276 +0000 UTC m=+0.087256945 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4)
Nov 25 19:56:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:56:31.187 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:56:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:56:31.188 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:56:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:56:31.189 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: ERROR   19:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: ERROR   19:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: ERROR   19:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: ERROR   19:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: ERROR   19:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:56:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:56:33 compute-0 nova_compute[187212]: 2025-11-25 19:56:33.670 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:38 compute-0 nova_compute[187212]: 2025-11-25 19:56:38.671 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:40 compute-0 podman[223802]: 2025-11-25 19:56:40.154724766 +0000 UTC m=+0.075416642 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:56:43 compute-0 nova_compute[187212]: 2025-11-25 19:56:43.674 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:43 compute-0 nova_compute[187212]: 2025-11-25 19:56:43.717 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:43 compute-0 nova_compute[187212]: 2025-11-25 19:56:43.718 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:56:43 compute-0 nova_compute[187212]: 2025-11-25 19:56:43.718 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:43 compute-0 nova_compute[187212]: 2025-11-25 19:56:43.719 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:43 compute-0 nova_compute[187212]: 2025-11-25 19:56:43.720 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:48 compute-0 nova_compute[187212]: 2025-11-25 19:56:48.720 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:48 compute-0 nova_compute[187212]: 2025-11-25 19:56:48.722 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:48 compute-0 nova_compute[187212]: 2025-11-25 19:56:48.723 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:56:48 compute-0 nova_compute[187212]: 2025-11-25 19:56:48.723 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:48 compute-0 nova_compute[187212]: 2025-11-25 19:56:48.760 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:48 compute-0 nova_compute[187212]: 2025-11-25 19:56:48.760 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:49 compute-0 podman[223827]: 2025-11-25 19:56:49.183604116 +0000 UTC m=+0.104871080 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 19:56:53 compute-0 podman[223854]: 2025-11-25 19:56:53.160969861 +0000 UTC m=+0.083109086 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 19:56:53 compute-0 nova_compute[187212]: 2025-11-25 19:56:53.761 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:53 compute-0 nova_compute[187212]: 2025-11-25 19:56:53.763 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:53 compute-0 nova_compute[187212]: 2025-11-25 19:56:53.764 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:56:53 compute-0 nova_compute[187212]: 2025-11-25 19:56:53.764 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:53 compute-0 nova_compute[187212]: 2025-11-25 19:56:53.816 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:53 compute-0 nova_compute[187212]: 2025-11-25 19:56:53.817 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:58 compute-0 podman[223874]: 2025-11-25 19:56:58.180328056 +0000 UTC m=+0.096600392 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public)
Nov 25 19:56:58 compute-0 nova_compute[187212]: 2025-11-25 19:56:58.818 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:56:58 compute-0 nova_compute[187212]: 2025-11-25 19:56:58.819 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:58 compute-0 nova_compute[187212]: 2025-11-25 19:56:58.819 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:56:58 compute-0 nova_compute[187212]: 2025-11-25 19:56:58.819 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:58 compute-0 nova_compute[187212]: 2025-11-25 19:56:58.820 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:56:58 compute-0 nova_compute[187212]: 2025-11-25 19:56:58.820 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:56:59 compute-0 nova_compute[187212]: 2025-11-25 19:56:59.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:56:59 compute-0 podman[197585]: time="2025-11-25T19:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:56:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:56:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 19:57:00 compute-0 nova_compute[187212]: 2025-11-25 19:57:00.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:00 compute-0 nova_compute[187212]: 2025-11-25 19:57:00.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:57:01 compute-0 podman[223896]: 2025-11-25 19:57:01.188969623 +0000 UTC m=+0.096640843 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: ERROR   19:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: ERROR   19:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: ERROR   19:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: ERROR   19:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: ERROR   19:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:57:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:57:02 compute-0 nova_compute[187212]: 2025-11-25 19:57:02.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:02 compute-0 nova_compute[187212]: 2025-11-25 19:57:02.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:57:02 compute-0 nova_compute[187212]: 2025-11-25 19:57:02.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:57:02 compute-0 nova_compute[187212]: 2025-11-25 19:57:02.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:57:02 compute-0 nova_compute[187212]: 2025-11-25 19:57:02.693 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.742 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.822 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.823 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.824 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.824 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.833 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.854 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.856 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:03 compute-0 sshd-session[223916]: Invalid user debian from 209.38.103.174 port 46900
Nov 25 19:57:03 compute-0 nova_compute[187212]: 2025-11-25 19:57:03.926 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:57:04 compute-0 sshd-session[223916]: Connection closed by invalid user debian 209.38.103.174 port 46900 [preauth]
Nov 25 19:57:04 compute-0 nova_compute[187212]: 2025-11-25 19:57:04.179 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:57:04 compute-0 nova_compute[187212]: 2025-11-25 19:57:04.181 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:57:04 compute-0 nova_compute[187212]: 2025-11-25 19:57:04.214 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:57:04 compute-0 nova_compute[187212]: 2025-11-25 19:57:04.215 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5598MB free_disk=72.96416091918945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:57:04 compute-0 nova_compute[187212]: 2025-11-25 19:57:04.215 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:57:04 compute-0 nova_compute[187212]: 2025-11-25 19:57:04.216 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:57:05 compute-0 nova_compute[187212]: 2025-11-25 19:57:05.777 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:57:05 compute-0 nova_compute[187212]: 2025-11-25 19:57:05.778 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:57:05 compute-0 nova_compute[187212]: 2025-11-25 19:57:05.778 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:57:04 up  1:49,  0 user,  load average: 0.18, 0.12, 0.13\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:57:05 compute-0 nova_compute[187212]: 2025-11-25 19:57:05.823 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:57:06 compute-0 nova_compute[187212]: 2025-11-25 19:57:06.331 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:57:06 compute-0 nova_compute[187212]: 2025-11-25 19:57:06.846 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:57:06 compute-0 nova_compute[187212]: 2025-11-25 19:57:06.846 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.631s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:57:07 compute-0 nova_compute[187212]: 2025-11-25 19:57:07.846 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:07 compute-0 nova_compute[187212]: 2025-11-25 19:57:07.847 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:08 compute-0 nova_compute[187212]: 2025-11-25 19:57:08.857 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:08 compute-0 nova_compute[187212]: 2025-11-25 19:57:08.859 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:08 compute-0 nova_compute[187212]: 2025-11-25 19:57:08.860 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:08 compute-0 nova_compute[187212]: 2025-11-25 19:57:08.860 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:08 compute-0 nova_compute[187212]: 2025-11-25 19:57:08.899 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:08 compute-0 nova_compute[187212]: 2025-11-25 19:57:08.900 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:10 compute-0 nova_compute[187212]: 2025-11-25 19:57:10.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:11 compute-0 podman[223925]: 2025-11-25 19:57:11.15606684 +0000 UTC m=+0.074979220 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:57:11 compute-0 nova_compute[187212]: 2025-11-25 19:57:11.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:13 compute-0 nova_compute[187212]: 2025-11-25 19:57:13.901 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:14 compute-0 nova_compute[187212]: 2025-11-25 19:57:14.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:57:18 compute-0 nova_compute[187212]: 2025-11-25 19:57:18.904 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:18 compute-0 nova_compute[187212]: 2025-11-25 19:57:18.906 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:18 compute-0 nova_compute[187212]: 2025-11-25 19:57:18.907 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:18 compute-0 nova_compute[187212]: 2025-11-25 19:57:18.907 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:18 compute-0 nova_compute[187212]: 2025-11-25 19:57:18.948 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:18 compute-0 nova_compute[187212]: 2025-11-25 19:57:18.949 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:20 compute-0 podman[223950]: 2025-11-25 19:57:20.240351932 +0000 UTC m=+0.159665756 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 25 19:57:23 compute-0 nova_compute[187212]: 2025-11-25 19:57:23.950 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:23 compute-0 nova_compute[187212]: 2025-11-25 19:57:23.952 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:23 compute-0 nova_compute[187212]: 2025-11-25 19:57:23.953 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:23 compute-0 nova_compute[187212]: 2025-11-25 19:57:23.953 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:23 compute-0 nova_compute[187212]: 2025-11-25 19:57:23.954 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:23 compute-0 nova_compute[187212]: 2025-11-25 19:57:23.956 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:24 compute-0 podman[223977]: 2025-11-25 19:57:24.177887756 +0000 UTC m=+0.087152463 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 25 19:57:28 compute-0 nova_compute[187212]: 2025-11-25 19:57:28.957 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:28 compute-0 nova_compute[187212]: 2025-11-25 19:57:28.959 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:28 compute-0 nova_compute[187212]: 2025-11-25 19:57:28.960 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:28 compute-0 nova_compute[187212]: 2025-11-25 19:57:28.960 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:28 compute-0 nova_compute[187212]: 2025-11-25 19:57:28.999 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:29 compute-0 nova_compute[187212]: 2025-11-25 19:57:29.000 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:29 compute-0 podman[223996]: 2025-11-25 19:57:29.181873057 +0000 UTC m=+0.098130182 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 19:57:29 compute-0 podman[197585]: time="2025-11-25T19:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:57:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:57:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 19:57:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:57:31.190 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:57:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:57:31.191 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:57:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:57:31.192 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: ERROR   19:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: ERROR   19:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: ERROR   19:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: ERROR   19:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: ERROR   19:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:57:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:57:32 compute-0 podman[224019]: 2025-11-25 19:57:32.208506547 +0000 UTC m=+0.122190627 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Nov 25 19:57:34 compute-0 nova_compute[187212]: 2025-11-25 19:57:34.001 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:39 compute-0 nova_compute[187212]: 2025-11-25 19:57:39.002 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:39 compute-0 nova_compute[187212]: 2025-11-25 19:57:39.004 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:39 compute-0 nova_compute[187212]: 2025-11-25 19:57:39.004 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:39 compute-0 nova_compute[187212]: 2025-11-25 19:57:39.004 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:39 compute-0 nova_compute[187212]: 2025-11-25 19:57:39.057 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:39 compute-0 nova_compute[187212]: 2025-11-25 19:57:39.057 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:42 compute-0 podman[224041]: 2025-11-25 19:57:42.168134813 +0000 UTC m=+0.084484101 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 19:57:44 compute-0 nova_compute[187212]: 2025-11-25 19:57:44.058 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:44 compute-0 nova_compute[187212]: 2025-11-25 19:57:44.061 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:44 compute-0 sshd-session[224065]: Invalid user debian from 209.38.103.174 port 56680
Nov 25 19:57:45 compute-0 sshd-session[224065]: Connection closed by invalid user debian 209.38.103.174 port 56680 [preauth]
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.061 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.094 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.094 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.094 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.098 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.098 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:49 compute-0 nova_compute[187212]: 2025-11-25 19:57:49.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:51 compute-0 podman[224067]: 2025-11-25 19:57:51.223736567 +0000 UTC m=+0.141614930 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 25 19:57:54 compute-0 nova_compute[187212]: 2025-11-25 19:57:54.099 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:54 compute-0 nova_compute[187212]: 2025-11-25 19:57:54.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:54 compute-0 nova_compute[187212]: 2025-11-25 19:57:54.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:54 compute-0 nova_compute[187212]: 2025-11-25 19:57:54.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:54 compute-0 nova_compute[187212]: 2025-11-25 19:57:54.101 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:54 compute-0 nova_compute[187212]: 2025-11-25 19:57:54.103 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:55 compute-0 podman[224093]: 2025-11-25 19:57:55.176947203 +0000 UTC m=+0.090100600 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 25 19:57:59 compute-0 nova_compute[187212]: 2025-11-25 19:57:59.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:59 compute-0 nova_compute[187212]: 2025-11-25 19:57:59.107 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:57:59 compute-0 nova_compute[187212]: 2025-11-25 19:57:59.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:57:59 compute-0 nova_compute[187212]: 2025-11-25 19:57:59.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:59 compute-0 nova_compute[187212]: 2025-11-25 19:57:59.134 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:57:59 compute-0 nova_compute[187212]: 2025-11-25 19:57:59.135 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:57:59 compute-0 podman[197585]: time="2025-11-25T19:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:57:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:57:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:58:00 compute-0 podman[224114]: 2025-11-25 19:58:00.138647327 +0000 UTC m=+0.068143940 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal 
Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350)
Nov 25 19:58:00 compute-0 nova_compute[187212]: 2025-11-25 19:58:00.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:00 compute-0 nova_compute[187212]: 2025-11-25 19:58:00.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:58:01 compute-0 nova_compute[187212]: 2025-11-25 19:58:01.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: ERROR   19:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: ERROR   19:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: ERROR   19:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: ERROR   19:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: ERROR   19:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:58:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:58:03 compute-0 podman[224136]: 2025-11-25 19:58:03.184057843 +0000 UTC m=+0.108532857 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd)
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.135 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:58:04 compute-0 nova_compute[187212]: 2025-11-25 19:58:04.688 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:58:05 compute-0 nova_compute[187212]: 2025-11-25 19:58:05.833 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:58:05 compute-0 nova_compute[187212]: 2025-11-25 19:58:05.930 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:58:05 compute-0 nova_compute[187212]: 2025-11-25 19:58:05.932 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.019 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.214 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.216 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.258 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.259 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5593MB free_disk=72.96417999267578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.260 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:58:06 compute-0 nova_compute[187212]: 2025-11-25 19:58:06.261 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:58:08 compute-0 nova_compute[187212]: 2025-11-25 19:58:08.182 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:58:08 compute-0 nova_compute[187212]: 2025-11-25 19:58:08.184 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:58:08 compute-0 nova_compute[187212]: 2025-11-25 19:58:08.185 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:58:06 up  1:50,  0 user,  load average: 0.06, 0.09, 0.12\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:58:08 compute-0 nova_compute[187212]: 2025-11-25 19:58:08.281 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:58:08 compute-0 nova_compute[187212]: 2025-11-25 19:58:08.789 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.137 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.140 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.140 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.140 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.176 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.177 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.300 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:58:09 compute-0 nova_compute[187212]: 2025-11-25 19:58:09.300 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.040s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:58:11 compute-0 nova_compute[187212]: 2025-11-25 19:58:11.298 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:11 compute-0 nova_compute[187212]: 2025-11-25 19:58:11.299 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:12 compute-0 nova_compute[187212]: 2025-11-25 19:58:12.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:12 compute-0 nova_compute[187212]: 2025-11-25 19:58:12.684 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:13 compute-0 podman[224163]: 2025-11-25 19:58:13.168585385 +0000 UTC m=+0.080935608 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:58:14 compute-0 nova_compute[187212]: 2025-11-25 19:58:14.178 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:14 compute-0 nova_compute[187212]: 2025-11-25 19:58:14.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:14 compute-0 nova_compute[187212]: 2025-11-25 19:58:14.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:58:14 compute-0 nova_compute[187212]: 2025-11-25 19:58:14.180 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:14 compute-0 nova_compute[187212]: 2025-11-25 19:58:14.180 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:14 compute-0 nova_compute[187212]: 2025-11-25 19:58:14.182 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:15 compute-0 nova_compute[187212]: 2025-11-25 19:58:15.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:58:19 compute-0 nova_compute[187212]: 2025-11-25 19:58:19.181 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:22 compute-0 podman[224188]: 2025-11-25 19:58:22.212627393 +0000 UTC m=+0.130076425 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 25 19:58:24 compute-0 nova_compute[187212]: 2025-11-25 19:58:24.183 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:24 compute-0 nova_compute[187212]: 2025-11-25 19:58:24.185 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:24 compute-0 nova_compute[187212]: 2025-11-25 19:58:24.186 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:58:24 compute-0 nova_compute[187212]: 2025-11-25 19:58:24.186 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:24 compute-0 nova_compute[187212]: 2025-11-25 19:58:24.225 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:24 compute-0 nova_compute[187212]: 2025-11-25 19:58:24.226 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:25 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:58:25.579 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 19:58:25 compute-0 nova_compute[187212]: 2025-11-25 19:58:25.581 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:25 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:58:25.581 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 19:58:26 compute-0 podman[224217]: 2025-11-25 19:58:26.18140268 +0000 UTC m=+0.100476423 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 19:58:26 compute-0 sshd-session[224215]: Invalid user debian from 209.38.103.174 port 50564
Nov 25 19:58:26 compute-0 sshd-session[224215]: Connection closed by invalid user debian 209.38.103.174 port 50564 [preauth]
Nov 25 19:58:29 compute-0 nova_compute[187212]: 2025-11-25 19:58:29.270 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:29 compute-0 podman[197585]: time="2025-11-25T19:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:58:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:58:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 19:58:31 compute-0 podman[224237]: 2025-11-25 19:58:31.1766312 +0000 UTC m=+0.099113178 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 19:58:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:58:31.192 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:58:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:58:31.193 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:58:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:58:31.193 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: ERROR   19:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: ERROR   19:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: ERROR   19:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: ERROR   19:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: ERROR   19:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:58:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:58:33 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:58:33.584 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 19:58:34 compute-0 podman[224261]: 2025-11-25 19:58:34.172986983 +0000 UTC m=+0.095620995 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:58:34 compute-0 nova_compute[187212]: 2025-11-25 19:58:34.272 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:39 compute-0 nova_compute[187212]: 2025-11-25 19:58:39.274 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:39 compute-0 nova_compute[187212]: 2025-11-25 19:58:39.277 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:39 compute-0 nova_compute[187212]: 2025-11-25 19:58:39.277 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:58:39 compute-0 nova_compute[187212]: 2025-11-25 19:58:39.277 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:39 compute-0 nova_compute[187212]: 2025-11-25 19:58:39.307 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:39 compute-0 nova_compute[187212]: 2025-11-25 19:58:39.308 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:44 compute-0 podman[224281]: 2025-11-25 19:58:44.170560845 +0000 UTC m=+0.086286143 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 19:58:44 compute-0 nova_compute[187212]: 2025-11-25 19:58:44.309 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:44 compute-0 nova_compute[187212]: 2025-11-25 19:58:44.312 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:49 compute-0 nova_compute[187212]: 2025-11-25 19:58:49.311 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:50 compute-0 sshd-session[224305]: Received disconnect from 203.83.238.251 port 51160:11:  [preauth]
Nov 25 19:58:50 compute-0 sshd-session[224305]: Disconnected from authenticating user root 203.83.238.251 port 51160 [preauth]
Nov 25 19:58:53 compute-0 podman[224307]: 2025-11-25 19:58:53.252022988 +0000 UTC m=+0.165515549 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 19:58:54 compute-0 nova_compute[187212]: 2025-11-25 19:58:54.316 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:54 compute-0 nova_compute[187212]: 2025-11-25 19:58:54.318 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:54 compute-0 nova_compute[187212]: 2025-11-25 19:58:54.318 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:58:54 compute-0 nova_compute[187212]: 2025-11-25 19:58:54.318 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:54 compute-0 nova_compute[187212]: 2025-11-25 19:58:54.362 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:54 compute-0 nova_compute[187212]: 2025-11-25 19:58:54.363 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:57 compute-0 podman[224333]: 2025-11-25 19:58:57.17354539 +0000 UTC m=+0.084178557 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 25 19:58:59 compute-0 nova_compute[187212]: 2025-11-25 19:58:59.364 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:59 compute-0 nova_compute[187212]: 2025-11-25 19:58:59.366 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:58:59 compute-0 nova_compute[187212]: 2025-11-25 19:58:59.366 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:58:59 compute-0 nova_compute[187212]: 2025-11-25 19:58:59.366 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:59 compute-0 nova_compute[187212]: 2025-11-25 19:58:59.399 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:58:59 compute-0 nova_compute[187212]: 2025-11-25 19:58:59.400 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:58:59 compute-0 podman[197585]: time="2025-11-25T19:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:58:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:58:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: ERROR   19:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: ERROR   19:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: ERROR   19:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: ERROR   19:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: ERROR   19:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:59:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:59:02 compute-0 nova_compute[187212]: 2025-11-25 19:59:02.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:02 compute-0 nova_compute[187212]: 2025-11-25 19:59:02.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:02 compute-0 nova_compute[187212]: 2025-11-25 19:59:02.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 19:59:02 compute-0 podman[224352]: 2025-11-25 19:59:02.177123321 +0000 UTC m=+0.092308322 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Nov 25 19:59:04 compute-0 nova_compute[187212]: 2025-11-25 19:59:04.432 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:04 compute-0 nova_compute[187212]: 2025-11-25 19:59:04.434 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:04 compute-0 nova_compute[187212]: 2025-11-25 19:59:04.434 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:04 compute-0 nova_compute[187212]: 2025-11-25 19:59:04.435 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:04 compute-0 nova_compute[187212]: 2025-11-25 19:59:04.435 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:04 compute-0 nova_compute[187212]: 2025-11-25 19:59:04.436 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:04 compute-0 ovn_controller[95465]: 2025-11-25T19:59:04Z|00180|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 25 19:59:05 compute-0 podman[224373]: 2025-11-25 19:59:05.173119023 +0000 UTC m=+0.086251072 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 19:59:05 compute-0 nova_compute[187212]: 2025-11-25 19:59:05.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:05 compute-0 nova_compute[187212]: 2025-11-25 19:59:05.735 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:59:05 compute-0 nova_compute[187212]: 2025-11-25 19:59:05.736 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:59:05 compute-0 nova_compute[187212]: 2025-11-25 19:59:05.736 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:59:05 compute-0 nova_compute[187212]: 2025-11-25 19:59:05.737 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 19:59:06 compute-0 nova_compute[187212]: 2025-11-25 19:59:06.785 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:59:06 compute-0 nova_compute[187212]: 2025-11-25 19:59:06.880 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:59:06 compute-0 nova_compute[187212]: 2025-11-25 19:59:06.882 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:59:06 compute-0 nova_compute[187212]: 2025-11-25 19:59:06.974 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:59:07 compute-0 nova_compute[187212]: 2025-11-25 19:59:07.196 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 19:59:07 compute-0 nova_compute[187212]: 2025-11-25 19:59:07.198 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 19:59:07 compute-0 nova_compute[187212]: 2025-11-25 19:59:07.231 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 19:59:07 compute-0 nova_compute[187212]: 2025-11-25 19:59:07.232 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5602MB free_disk=72.96416091918945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 19:59:07 compute-0 nova_compute[187212]: 2025-11-25 19:59:07.233 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:59:07 compute-0 nova_compute[187212]: 2025-11-25 19:59:07.233 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:59:07 compute-0 sshd-session[224400]: Invalid user debian from 209.38.103.174 port 34712
Nov 25 19:59:08 compute-0 sshd-session[224400]: Connection closed by invalid user debian 209.38.103.174 port 34712 [preauth]
Nov 25 19:59:08 compute-0 nova_compute[187212]: 2025-11-25 19:59:08.813 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 19:59:08 compute-0 nova_compute[187212]: 2025-11-25 19:59:08.814 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 19:59:08 compute-0 nova_compute[187212]: 2025-11-25 19:59:08.814 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:59:07 up  1:51,  0 user,  load average: 0.07, 0.09, 0.12\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 19:59:08 compute-0 nova_compute[187212]: 2025-11-25 19:59:08.868 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.376 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.436 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.439 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.440 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.468 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.469 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.888 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 19:59:09 compute-0 nova_compute[187212]: 2025-11-25 19:59:09.889 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.655s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:59:10 compute-0 nova_compute[187212]: 2025-11-25 19:59:10.889 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:10 compute-0 nova_compute[187212]: 2025-11-25 19:59:10.892 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:10 compute-0 nova_compute[187212]: 2025-11-25 19:59:10.892 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:11 compute-0 nova_compute[187212]: 2025-11-25 19:59:11.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.470 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.471 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.471 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.471 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.472 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:14 compute-0 nova_compute[187212]: 2025-11-25 19:59:14.473 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:15 compute-0 podman[224402]: 2025-11-25 19:59:15.170220463 +0000 UTC m=+0.089546408 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 19:59:15 compute-0 nova_compute[187212]: 2025-11-25 19:59:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:19 compute-0 nova_compute[187212]: 2025-11-25 19:59:19.475 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:19 compute-0 nova_compute[187212]: 2025-11-25 19:59:19.477 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:19 compute-0 nova_compute[187212]: 2025-11-25 19:59:19.477 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:19 compute-0 nova_compute[187212]: 2025-11-25 19:59:19.477 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:19 compute-0 nova_compute[187212]: 2025-11-25 19:59:19.478 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:19 compute-0 nova_compute[187212]: 2025-11-25 19:59:19.479 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:24 compute-0 podman[224426]: 2025-11-25 19:59:24.228021152 +0000 UTC m=+0.142960815 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 19:59:24 compute-0 nova_compute[187212]: 2025-11-25 19:59:24.481 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:24 compute-0 nova_compute[187212]: 2025-11-25 19:59:24.483 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:24 compute-0 nova_compute[187212]: 2025-11-25 19:59:24.483 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:24 compute-0 nova_compute[187212]: 2025-11-25 19:59:24.483 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:24 compute-0 nova_compute[187212]: 2025-11-25 19:59:24.514 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:24 compute-0 nova_compute[187212]: 2025-11-25 19:59:24.514 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:28 compute-0 podman[224453]: 2025-11-25 19:59:28.168092572 +0000 UTC m=+0.083769537 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 25 19:59:29 compute-0 nova_compute[187212]: 2025-11-25 19:59:29.515 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:29 compute-0 nova_compute[187212]: 2025-11-25 19:59:29.517 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:29 compute-0 nova_compute[187212]: 2025-11-25 19:59:29.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:29 compute-0 nova_compute[187212]: 2025-11-25 19:59:29.518 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:29 compute-0 nova_compute[187212]: 2025-11-25 19:59:29.559 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:29 compute-0 nova_compute[187212]: 2025-11-25 19:59:29.560 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:29 compute-0 podman[197585]: time="2025-11-25T19:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:59:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:59:29 compute-0 podman[197585]: @ - - [25/Nov/2025:19:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3092 "" "Go-http-client/1.1"
Nov 25 19:59:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:59:31.195 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 19:59:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:59:31.195 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 19:59:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 19:59:31.196 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: ERROR   19:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: ERROR   19:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: ERROR   19:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: ERROR   19:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: ERROR   19:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 19:59:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 19:59:33 compute-0 podman[224473]: 2025-11-25 19:59:33.152975943 +0000 UTC m=+0.076647130 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Nov 25 19:59:34 compute-0 nova_compute[187212]: 2025-11-25 19:59:34.574 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:34 compute-0 nova_compute[187212]: 2025-11-25 19:59:34.576 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:34 compute-0 nova_compute[187212]: 2025-11-25 19:59:34.577 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5017 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:34 compute-0 nova_compute[187212]: 2025-11-25 19:59:34.577 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:34 compute-0 nova_compute[187212]: 2025-11-25 19:59:34.578 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:34 compute-0 nova_compute[187212]: 2025-11-25 19:59:34.579 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:36 compute-0 podman[224493]: 2025-11-25 19:59:36.176831729 +0000 UTC m=+0.087754201 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 25 19:59:39 compute-0 nova_compute[187212]: 2025-11-25 19:59:39.580 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:39 compute-0 nova_compute[187212]: 2025-11-25 19:59:39.583 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:39 compute-0 nova_compute[187212]: 2025-11-25 19:59:39.583 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:39 compute-0 nova_compute[187212]: 2025-11-25 19:59:39.583 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:39 compute-0 nova_compute[187212]: 2025-11-25 19:59:39.619 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:39 compute-0 nova_compute[187212]: 2025-11-25 19:59:39.619 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:40 compute-0 nova_compute[187212]: 2025-11-25 19:59:40.420 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 19:59:44 compute-0 nova_compute[187212]: 2025-11-25 19:59:44.621 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:44 compute-0 nova_compute[187212]: 2025-11-25 19:59:44.625 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:46 compute-0 podman[224515]: 2025-11-25 19:59:46.161064051 +0000 UTC m=+0.074405181 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 19:59:47 compute-0 sshd-session[224539]: Invalid user debian from 209.38.103.174 port 44690
Nov 25 19:59:47 compute-0 sshd-session[224539]: Connection closed by invalid user debian 209.38.103.174 port 44690 [preauth]
Nov 25 19:59:49 compute-0 nova_compute[187212]: 2025-11-25 19:59:49.624 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:49 compute-0 nova_compute[187212]: 2025-11-25 19:59:49.625 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:54 compute-0 nova_compute[187212]: 2025-11-25 19:59:54.627 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:55 compute-0 podman[224541]: 2025-11-25 19:59:55.220723877 +0000 UTC m=+0.141693882 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 25 19:59:59 compute-0 podman[224568]: 2025-11-25 19:59:59.163271803 +0000 UTC m=+0.085019499 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 19:59:59 compute-0 nova_compute[187212]: 2025-11-25 19:59:59.630 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:59 compute-0 nova_compute[187212]: 2025-11-25 19:59:59.631 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 19:59:59 compute-0 nova_compute[187212]: 2025-11-25 19:59:59.631 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 19:59:59 compute-0 nova_compute[187212]: 2025-11-25 19:59:59.632 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:59 compute-0 nova_compute[187212]: 2025-11-25 19:59:59.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 19:59:59 compute-0 nova_compute[187212]: 2025-11-25 19:59:59.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 19:59:59 compute-0 podman[197585]: time="2025-11-25T19:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 19:59:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 19:59:59 compute-0 podman[197585]: @ - - [25/Nov/2025:19:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3093 "" "Go-http-client/1.1"
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: ERROR   20:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: ERROR   20:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: ERROR   20:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: ERROR   20:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: ERROR   20:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:00:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:00:02 compute-0 nova_compute[187212]: 2025-11-25 20:00:02.684 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:03 compute-0 nova_compute[187212]: 2025-11-25 20:00:03.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:03 compute-0 nova_compute[187212]: 2025-11-25 20:00:03.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:00:04 compute-0 podman[224588]: 2025-11-25 20:00:04.184256865 +0000 UTC m=+0.097503988 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 20:00:04 compute-0 nova_compute[187212]: 2025-11-25 20:00:04.673 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:04 compute-0 nova_compute[187212]: 2025-11-25 20:00:04.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:04 compute-0 nova_compute[187212]: 2025-11-25 20:00:04.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:04 compute-0 nova_compute[187212]: 2025-11-25 20:00:04.675 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:04 compute-0 nova_compute[187212]: 2025-11-25 20:00:04.676 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:04 compute-0 nova_compute[187212]: 2025-11-25 20:00:04.678 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:06 compute-0 nova_compute[187212]: 2025-11-25 20:00:06.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:06 compute-0 nova_compute[187212]: 2025-11-25 20:00:06.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:00:06 compute-0 nova_compute[187212]: 2025-11-25 20:00:06.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:00:06 compute-0 nova_compute[187212]: 2025-11-25 20:00:06.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:00:06 compute-0 nova_compute[187212]: 2025-11-25 20:00:06.691 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:00:07 compute-0 podman[224609]: 2025-11-25 20:00:07.168824347 +0000 UTC m=+0.086135499 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 25 20:00:07 compute-0 nova_compute[187212]: 2025-11-25 20:00:07.732 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:00:07 compute-0 nova_compute[187212]: 2025-11-25 20:00:07.815 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:00:07 compute-0 nova_compute[187212]: 2025-11-25 20:00:07.816 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:00:07 compute-0 nova_compute[187212]: 2025-11-25 20:00:07.878 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:00:08 compute-0 nova_compute[187212]: 2025-11-25 20:00:08.103 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:00:08 compute-0 nova_compute[187212]: 2025-11-25 20:00:08.104 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:00:08 compute-0 nova_compute[187212]: 2025-11-25 20:00:08.135 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:00:08 compute-0 nova_compute[187212]: 2025-11-25 20:00:08.136 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=72.96415710449219GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:00:08 compute-0 nova_compute[187212]: 2025-11-25 20:00:08.136 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:00:08 compute-0 nova_compute[187212]: 2025-11-25 20:00:08.137 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.678 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.680 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.680 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.681 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.708 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.708 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.709 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:00:08 up  1:52,  0 user,  load average: 0.02, 0.07, 0.10\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.727 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.728 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:09 compute-0 nova_compute[187212]: 2025-11-25 20:00:09.791 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:00:10 compute-0 nova_compute[187212]: 2025-11-25 20:00:10.299 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:00:10 compute-0 nova_compute[187212]: 2025-11-25 20:00:10.812 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:00:10 compute-0 nova_compute[187212]: 2025-11-25 20:00:10.813 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.676s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.688 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Triggering sync for uuid f71d9429-2da3-4b6b-b82d-63027e46f952 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.689 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "f71d9429-2da3-4b6b-b82d-63027e46f952" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.691 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:12 compute-0 nova_compute[187212]: 2025-11-25 20:00:12.691 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 20:00:13 compute-0 nova_compute[187212]: 2025-11-25 20:00:13.201 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "f71d9429-2da3-4b6b-b82d-63027e46f952" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.511s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.193 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.705 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.728 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.730 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.731 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.731 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.769 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:14 compute-0 nova_compute[187212]: 2025-11-25 20:00:14.770 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:15 compute-0 nova_compute[187212]: 2025-11-25 20:00:15.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:17 compute-0 podman[224636]: 2025-11-25 20:00:17.181063502 +0000 UTC m=+0.093774700 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:00:19 compute-0 nova_compute[187212]: 2025-11-25 20:00:19.770 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:24 compute-0 nova_compute[187212]: 2025-11-25 20:00:24.772 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:26 compute-0 podman[224661]: 2025-11-25 20:00:26.214210552 +0000 UTC m=+0.137653506 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 25 20:00:28 compute-0 sshd-session[224687]: Invalid user debian from 209.38.103.174 port 50986
Nov 25 20:00:28 compute-0 sshd-session[224687]: Connection closed by invalid user debian 209.38.103.174 port 50986 [preauth]
Nov 25 20:00:29 compute-0 podman[197585]: time="2025-11-25T20:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:00:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18581 "" "Go-http-client/1.1"
Nov 25 20:00:29 compute-0 podman[197585]: time="2025-11-25T20:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:00:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:00:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3096 "" "Go-http-client/1.1"
Nov 25 20:00:29 compute-0 nova_compute[187212]: 2025-11-25 20:00:29.775 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:29 compute-0 nova_compute[187212]: 2025-11-25 20:00:29.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:29 compute-0 nova_compute[187212]: 2025-11-25 20:00:29.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:29 compute-0 nova_compute[187212]: 2025-11-25 20:00:29.778 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:29 compute-0 nova_compute[187212]: 2025-11-25 20:00:29.832 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:29 compute-0 nova_compute[187212]: 2025-11-25 20:00:29.833 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:30 compute-0 podman[224689]: 2025-11-25 20:00:30.151745875 +0000 UTC m=+0.073554667 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Nov 25 20:00:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:00:31.197 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:00:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:00:31.199 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:00:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:00:31.199 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: ERROR   20:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: ERROR   20:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: ERROR   20:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: ERROR   20:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: ERROR   20:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:00:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:00:34 compute-0 nova_compute[187212]: 2025-11-25 20:00:34.834 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:35 compute-0 podman[224712]: 2025-11-25 20:00:35.167272521 +0000 UTC m=+0.088700136 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, version=9.6)
Nov 25 20:00:35 compute-0 nova_compute[187212]: 2025-11-25 20:00:35.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:35 compute-0 nova_compute[187212]: 2025-11-25 20:00:35.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 20:00:36 compute-0 nova_compute[187212]: 2025-11-25 20:00:36.281 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 20:00:36 compute-0 nova_compute[187212]: 2025-11-25 20:00:36.281 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:00:38 compute-0 podman[224734]: 2025-11-25 20:00:38.20066052 +0000 UTC m=+0.118842341 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 20:00:39 compute-0 nova_compute[187212]: 2025-11-25 20:00:39.835 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:39 compute-0 nova_compute[187212]: 2025-11-25 20:00:39.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:39 compute-0 nova_compute[187212]: 2025-11-25 20:00:39.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:39 compute-0 nova_compute[187212]: 2025-11-25 20:00:39.837 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:39 compute-0 nova_compute[187212]: 2025-11-25 20:00:39.872 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:39 compute-0 nova_compute[187212]: 2025-11-25 20:00:39.873 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:44 compute-0 nova_compute[187212]: 2025-11-25 20:00:44.873 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:44 compute-0 nova_compute[187212]: 2025-11-25 20:00:44.874 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:44 compute-0 nova_compute[187212]: 2025-11-25 20:00:44.875 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:44 compute-0 nova_compute[187212]: 2025-11-25 20:00:44.875 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:44 compute-0 nova_compute[187212]: 2025-11-25 20:00:44.875 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:44 compute-0 nova_compute[187212]: 2025-11-25 20:00:44.877 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:48 compute-0 podman[224755]: 2025-11-25 20:00:48.170110452 +0000 UTC m=+0.079107734 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:00:49 compute-0 nova_compute[187212]: 2025-11-25 20:00:49.877 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:54 compute-0 nova_compute[187212]: 2025-11-25 20:00:54.878 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:54 compute-0 nova_compute[187212]: 2025-11-25 20:00:54.880 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:54 compute-0 nova_compute[187212]: 2025-11-25 20:00:54.880 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:54 compute-0 nova_compute[187212]: 2025-11-25 20:00:54.881 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:54 compute-0 nova_compute[187212]: 2025-11-25 20:00:54.918 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:54 compute-0 nova_compute[187212]: 2025-11-25 20:00:54.919 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:57 compute-0 podman[224779]: 2025-11-25 20:00:57.206454744 +0000 UTC m=+0.131816892 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 20:00:59 compute-0 podman[197585]: time="2025-11-25T20:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:00:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:00:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3097 "" "Go-http-client/1.1"
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.920 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.921 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.922 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:00:59 compute-0 nova_compute[187212]: 2025-11-25 20:00:59.923 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:01 compute-0 podman[224806]: 2025-11-25 20:01:01.161741546 +0000 UTC m=+0.082875743 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Nov 25 20:01:01 compute-0 CROND[224827]: (root) CMD (run-parts /etc/cron.hourly)
Nov 25 20:01:01 compute-0 run-parts[224830]: (/etc/cron.hourly) starting 0anacron
Nov 25 20:01:01 compute-0 run-parts[224836]: (/etc/cron.hourly) finished 0anacron
Nov 25 20:01:01 compute-0 CROND[224826]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: ERROR   20:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: ERROR   20:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: ERROR   20:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: ERROR   20:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: ERROR   20:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:01:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:01:04 compute-0 nova_compute[187212]: 2025-11-25 20:01:04.923 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:04 compute-0 nova_compute[187212]: 2025-11-25 20:01:04.926 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:05 compute-0 nova_compute[187212]: 2025-11-25 20:01:05.789 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:05 compute-0 nova_compute[187212]: 2025-11-25 20:01:05.790 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:05 compute-0 nova_compute[187212]: 2025-11-25 20:01:05.790 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:01:06 compute-0 podman[224838]: 2025-11-25 20:01:06.160288796 +0000 UTC m=+0.087789212 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 25 20:01:06 compute-0 nova_compute[187212]: 2025-11-25 20:01:06.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:06 compute-0 nova_compute[187212]: 2025-11-25 20:01:06.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:01:06 compute-0 nova_compute[187212]: 2025-11-25 20:01:06.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:01:06 compute-0 nova_compute[187212]: 2025-11-25 20:01:06.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:01:06 compute-0 nova_compute[187212]: 2025-11-25 20:01:06.693 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:01:07 compute-0 sshd-session[224860]: Invalid user debian from 209.38.103.174 port 35310
Nov 25 20:01:07 compute-0 sshd-session[224860]: Connection closed by invalid user debian 209.38.103.174 port 35310 [preauth]
Nov 25 20:01:07 compute-0 nova_compute[187212]: 2025-11-25 20:01:07.748 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:01:07 compute-0 nova_compute[187212]: 2025-11-25 20:01:07.827 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:01:07 compute-0 nova_compute[187212]: 2025-11-25 20:01:07.828 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:01:07 compute-0 nova_compute[187212]: 2025-11-25 20:01:07.887 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:01:08 compute-0 nova_compute[187212]: 2025-11-25 20:01:08.084 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:01:08 compute-0 nova_compute[187212]: 2025-11-25 20:01:08.087 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:01:08 compute-0 nova_compute[187212]: 2025-11-25 20:01:08.129 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:01:08 compute-0 nova_compute[187212]: 2025-11-25 20:01:08.130 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5596MB free_disk=72.96417617797852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:01:08 compute-0 nova_compute[187212]: 2025-11-25 20:01:08.130 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:01:08 compute-0 nova_compute[187212]: 2025-11-25 20:01:08.131 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:01:09 compute-0 podman[224869]: 2025-11-25 20:01:09.181924294 +0000 UTC m=+0.093846872 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.717 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.718 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.719 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:01:08 up  1:53,  0 user,  load average: 0.01, 0.05, 0.09\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.738 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.762 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.763 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.775 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.803 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.847 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:01:09 compute-0 nova_compute[187212]: 2025-11-25 20:01:09.926 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:10 compute-0 nova_compute[187212]: 2025-11-25 20:01:10.356 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:01:10 compute-0 nova_compute[187212]: 2025-11-25 20:01:10.870 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:01:10 compute-0 nova_compute[187212]: 2025-11-25 20:01:10.870 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.740s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.871 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.872 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.873 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.873 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.929 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.931 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.931 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.932 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.955 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:14 compute-0 nova_compute[187212]: 2025-11-25 20:01:14.956 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:16 compute-0 nova_compute[187212]: 2025-11-25 20:01:16.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:01:19 compute-0 podman[224890]: 2025-11-25 20:01:19.167343185 +0000 UTC m=+0.080315066 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:01:19 compute-0 nova_compute[187212]: 2025-11-25 20:01:19.957 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:19 compute-0 nova_compute[187212]: 2025-11-25 20:01:19.958 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:19 compute-0 nova_compute[187212]: 2025-11-25 20:01:19.958 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:01:19 compute-0 nova_compute[187212]: 2025-11-25 20:01:19.958 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:19 compute-0 nova_compute[187212]: 2025-11-25 20:01:19.959 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:19 compute-0 nova_compute[187212]: 2025-11-25 20:01:19.959 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:24 compute-0 nova_compute[187212]: 2025-11-25 20:01:24.960 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:28 compute-0 podman[224914]: 2025-11-25 20:01:28.232869897 +0000 UTC m=+0.155744980 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Nov 25 20:01:29 compute-0 podman[197585]: time="2025-11-25T20:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:01:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:01:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3100 "" "Go-http-client/1.1"
Nov 25 20:01:29 compute-0 nova_compute[187212]: 2025-11-25 20:01:29.962 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:29 compute-0 nova_compute[187212]: 2025-11-25 20:01:29.963 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:29 compute-0 nova_compute[187212]: 2025-11-25 20:01:29.963 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:01:29 compute-0 nova_compute[187212]: 2025-11-25 20:01:29.963 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:29 compute-0 nova_compute[187212]: 2025-11-25 20:01:29.964 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:29 compute-0 nova_compute[187212]: 2025-11-25 20:01:29.965 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:01:31.200 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:01:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:01:31.201 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:01:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:01:31.201 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: ERROR   20:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: ERROR   20:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: ERROR   20:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: ERROR   20:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: ERROR   20:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:01:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:01:32 compute-0 podman[224943]: 2025-11-25 20:01:32.147469808 +0000 UTC m=+0.069668466 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Nov 25 20:01:34 compute-0 nova_compute[187212]: 2025-11-25 20:01:34.966 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:37 compute-0 podman[224962]: 2025-11-25 20:01:37.159334768 +0000 UTC m=+0.078933439 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal)
Nov 25 20:01:39 compute-0 nova_compute[187212]: 2025-11-25 20:01:39.968 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:40 compute-0 podman[224983]: 2025-11-25 20:01:40.109891155 +0000 UTC m=+0.099980653 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Nov 25 20:01:44 compute-0 nova_compute[187212]: 2025-11-25 20:01:44.971 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:44 compute-0 nova_compute[187212]: 2025-11-25 20:01:44.972 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:44 compute-0 nova_compute[187212]: 2025-11-25 20:01:44.972 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:01:44 compute-0 nova_compute[187212]: 2025-11-25 20:01:44.973 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:45 compute-0 nova_compute[187212]: 2025-11-25 20:01:45.005 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:45 compute-0 nova_compute[187212]: 2025-11-25 20:01:45.006 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:45 compute-0 sshd-session[225003]: Invalid user debian from 209.38.103.174 port 36954
Nov 25 20:01:45 compute-0 sshd-session[225003]: Connection closed by invalid user debian 209.38.103.174 port 36954 [preauth]
Nov 25 20:01:50 compute-0 nova_compute[187212]: 2025-11-25 20:01:50.005 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:50 compute-0 podman[225005]: 2025-11-25 20:01:50.163832143 +0000 UTC m=+0.073475626 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 20:01:55 compute-0 nova_compute[187212]: 2025-11-25 20:01:55.007 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:55 compute-0 nova_compute[187212]: 2025-11-25 20:01:55.009 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:01:55 compute-0 nova_compute[187212]: 2025-11-25 20:01:55.009 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:01:55 compute-0 nova_compute[187212]: 2025-11-25 20:01:55.009 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:55 compute-0 nova_compute[187212]: 2025-11-25 20:01:55.058 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:01:55 compute-0 nova_compute[187212]: 2025-11-25 20:01:55.061 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:01:59 compute-0 podman[225030]: 2025-11-25 20:01:59.191027447 +0000 UTC m=+0.111363883 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 25 20:01:59 compute-0 podman[197585]: time="2025-11-25T20:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:01:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:01:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 20:02:00 compute-0 nova_compute[187212]: 2025-11-25 20:02:00.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:00 compute-0 nova_compute[187212]: 2025-11-25 20:02:00.057 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: ERROR   20:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: ERROR   20:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: ERROR   20:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: ERROR   20:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: ERROR   20:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:02:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:02:03 compute-0 podman[225058]: 2025-11-25 20:02:03.168286417 +0000 UTC m=+0.083781467 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.058 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.101 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:05 compute-0 nova_compute[187212]: 2025-11-25 20:02:05.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.698 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.698 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.699 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:02:06 compute-0 nova_compute[187212]: 2025-11-25 20:02:06.699 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:02:07 compute-0 nova_compute[187212]: 2025-11-25 20:02:07.746 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:02:07 compute-0 nova_compute[187212]: 2025-11-25 20:02:07.832 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:02:07 compute-0 nova_compute[187212]: 2025-11-25 20:02:07.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:02:07 compute-0 nova_compute[187212]: 2025-11-25 20:02:07.896 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:02:08 compute-0 nova_compute[187212]: 2025-11-25 20:02:08.138 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:02:08 compute-0 nova_compute[187212]: 2025-11-25 20:02:08.141 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:02:08 compute-0 nova_compute[187212]: 2025-11-25 20:02:08.165 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:02:08 compute-0 nova_compute[187212]: 2025-11-25 20:02:08.166 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5598MB free_disk=72.96415710449219GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:02:08 compute-0 nova_compute[187212]: 2025-11-25 20:02:08.167 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:02:08 compute-0 nova_compute[187212]: 2025-11-25 20:02:08.168 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:02:08 compute-0 podman[225083]: 2025-11-25 20:02:08.185335853 +0000 UTC m=+0.100157238 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 25 20:02:09 compute-0 nova_compute[187212]: 2025-11-25 20:02:09.740 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:02:09 compute-0 nova_compute[187212]: 2025-11-25 20:02:09.741 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:02:09 compute-0 nova_compute[187212]: 2025-11-25 20:02:09.741 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:02:08 up  1:54,  0 user,  load average: 0.00, 0.04, 0.09\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:02:09 compute-0 nova_compute[187212]: 2025-11-25 20:02:09.793 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.102 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.104 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.105 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.107 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.301 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.812 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:02:10 compute-0 nova_compute[187212]: 2025-11-25 20:02:10.813 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.645s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:02:11 compute-0 podman[225106]: 2025-11-25 20:02:11.174836076 +0000 UTC m=+0.091352656 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Nov 25 20:02:14 compute-0 nova_compute[187212]: 2025-11-25 20:02:14.814 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:14 compute-0 nova_compute[187212]: 2025-11-25 20:02:14.815 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.110 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.111 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.111 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.156 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.158 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.326 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.327 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:15 compute-0 nova_compute[187212]: 2025-11-25 20:02:15.327 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:16 compute-0 nova_compute[187212]: 2025-11-25 20:02:16.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:02:20 compute-0 nova_compute[187212]: 2025-11-25 20:02:20.159 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:20 compute-0 nova_compute[187212]: 2025-11-25 20:02:20.160 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:20 compute-0 nova_compute[187212]: 2025-11-25 20:02:20.161 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:20 compute-0 nova_compute[187212]: 2025-11-25 20:02:20.161 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:20 compute-0 nova_compute[187212]: 2025-11-25 20:02:20.198 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:20 compute-0 nova_compute[187212]: 2025-11-25 20:02:20.198 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:21 compute-0 podman[225126]: 2025-11-25 20:02:21.143817407 +0000 UTC m=+0.063456672 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 20:02:23 compute-0 sshd-session[225151]: Invalid user debian from 209.38.103.174 port 34178
Nov 25 20:02:23 compute-0 sshd-session[225151]: Connection closed by invalid user debian 209.38.103.174 port 34178 [preauth]
Nov 25 20:02:25 compute-0 nova_compute[187212]: 2025-11-25 20:02:25.199 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:25 compute-0 nova_compute[187212]: 2025-11-25 20:02:25.200 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:25 compute-0 nova_compute[187212]: 2025-11-25 20:02:25.200 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:25 compute-0 nova_compute[187212]: 2025-11-25 20:02:25.200 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:25 compute-0 nova_compute[187212]: 2025-11-25 20:02:25.201 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:25 compute-0 nova_compute[187212]: 2025-11-25 20:02:25.202 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:29 compute-0 podman[197585]: time="2025-11-25T20:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:02:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:02:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3092 "" "Go-http-client/1.1"
Nov 25 20:02:30 compute-0 nova_compute[187212]: 2025-11-25 20:02:30.205 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:30 compute-0 nova_compute[187212]: 2025-11-25 20:02:30.207 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:30 compute-0 nova_compute[187212]: 2025-11-25 20:02:30.207 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:30 compute-0 nova_compute[187212]: 2025-11-25 20:02:30.207 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:30 compute-0 nova_compute[187212]: 2025-11-25 20:02:30.238 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:30 compute-0 nova_compute[187212]: 2025-11-25 20:02:30.239 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:30 compute-0 podman[225153]: 2025-11-25 20:02:30.262695133 +0000 UTC m=+0.184775596 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 20:02:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:02:31.202 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:02:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:02:31.203 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:02:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:02:31.204 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: ERROR   20:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: ERROR   20:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: ERROR   20:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: ERROR   20:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: ERROR   20:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:02:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:02:34 compute-0 podman[225180]: 2025-11-25 20:02:34.170249208 +0000 UTC m=+0.078531189 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 20:02:35 compute-0 nova_compute[187212]: 2025-11-25 20:02:35.240 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:39 compute-0 podman[225199]: 2025-11-25 20:02:39.169826716 +0000 UTC m=+0.080880661 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 25 20:02:40 compute-0 nova_compute[187212]: 2025-11-25 20:02:40.243 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:40 compute-0 nova_compute[187212]: 2025-11-25 20:02:40.245 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:40 compute-0 nova_compute[187212]: 2025-11-25 20:02:40.245 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:40 compute-0 nova_compute[187212]: 2025-11-25 20:02:40.245 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:40 compute-0 nova_compute[187212]: 2025-11-25 20:02:40.278 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:40 compute-0 nova_compute[187212]: 2025-11-25 20:02:40.279 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:42 compute-0 podman[225220]: 2025-11-25 20:02:42.203929112 +0000 UTC m=+0.117150975 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 20:02:45 compute-0 nova_compute[187212]: 2025-11-25 20:02:45.280 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:50 compute-0 nova_compute[187212]: 2025-11-25 20:02:50.282 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:50 compute-0 nova_compute[187212]: 2025-11-25 20:02:50.284 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:02:50 compute-0 nova_compute[187212]: 2025-11-25 20:02:50.284 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:02:50 compute-0 nova_compute[187212]: 2025-11-25 20:02:50.284 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:50 compute-0 nova_compute[187212]: 2025-11-25 20:02:50.324 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:50 compute-0 nova_compute[187212]: 2025-11-25 20:02:50.325 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:02:52 compute-0 podman[225242]: 2025-11-25 20:02:52.174236546 +0000 UTC m=+0.084151857 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:02:55 compute-0 nova_compute[187212]: 2025-11-25 20:02:55.326 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:02:59 compute-0 podman[197585]: time="2025-11-25T20:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:02:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:02:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3094 "" "Go-http-client/1.1"
Nov 25 20:03:00 compute-0 nova_compute[187212]: 2025-11-25 20:03:00.327 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:00 compute-0 nova_compute[187212]: 2025-11-25 20:03:00.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:00 compute-0 nova_compute[187212]: 2025-11-25 20:03:00.329 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:03:00 compute-0 nova_compute[187212]: 2025-11-25 20:03:00.330 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:00 compute-0 nova_compute[187212]: 2025-11-25 20:03:00.378 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:00 compute-0 nova_compute[187212]: 2025-11-25 20:03:00.379 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:01 compute-0 podman[225267]: 2025-11-25 20:03:01.193490539 +0000 UTC m=+0.123400960 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest)
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: ERROR   20:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: ERROR   20:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: ERROR   20:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: ERROR   20:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: ERROR   20:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:03:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:03:01 compute-0 sshd-session[225293]: Invalid user debian from 209.38.103.174 port 35376
Nov 25 20:03:01 compute-0 sshd-session[225293]: Connection closed by invalid user debian 209.38.103.174 port 35376 [preauth]
Nov 25 20:03:05 compute-0 nova_compute[187212]: 2025-11-25 20:03:05.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:05 compute-0 podman[225295]: 2025-11-25 20:03:05.174099866 +0000 UTC m=+0.090292958 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 25 20:03:05 compute-0 nova_compute[187212]: 2025-11-25 20:03:05.379 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:05 compute-0 nova_compute[187212]: 2025-11-25 20:03:05.380 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:07 compute-0 nova_compute[187212]: 2025-11-25 20:03:07.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:07 compute-0 nova_compute[187212]: 2025-11-25 20:03:07.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:03:07 compute-0 nova_compute[187212]: 2025-11-25 20:03:07.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:03:07 compute-0 nova_compute[187212]: 2025-11-25 20:03:07.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:03:07 compute-0 nova_compute[187212]: 2025-11-25 20:03:07.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:03:08 compute-0 nova_compute[187212]: 2025-11-25 20:03:08.769 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:03:08 compute-0 nova_compute[187212]: 2025-11-25 20:03:08.864 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:03:08 compute-0 nova_compute[187212]: 2025-11-25 20:03:08.865 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:03:08 compute-0 nova_compute[187212]: 2025-11-25 20:03:08.947 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:03:09 compute-0 nova_compute[187212]: 2025-11-25 20:03:09.194 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:03:09 compute-0 nova_compute[187212]: 2025-11-25 20:03:09.196 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:03:09 compute-0 nova_compute[187212]: 2025-11-25 20:03:09.240 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:03:09 compute-0 nova_compute[187212]: 2025-11-25 20:03:09.241 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=72.96417617797852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:03:09 compute-0 nova_compute[187212]: 2025-11-25 20:03:09.241 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:03:09 compute-0 nova_compute[187212]: 2025-11-25 20:03:09.241 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:03:10 compute-0 podman[225321]: 2025-11-25 20:03:10.178549062 +0000 UTC m=+0.091298455 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.382 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.384 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.384 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.384 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.385 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.387 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.824 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.825 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.825 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:03:09 up  1:55,  0 user,  load average: 0.00, 0.03, 0.08\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:03:10 compute-0 nova_compute[187212]: 2025-11-25 20:03:10.924 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:03:11 compute-0 nova_compute[187212]: 2025-11-25 20:03:11.433 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:03:11 compute-0 nova_compute[187212]: 2025-11-25 20:03:11.945 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:03:11 compute-0 nova_compute[187212]: 2025-11-25 20:03:11.946 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.704s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:03:12 compute-0 nova_compute[187212]: 2025-11-25 20:03:12.946 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:12 compute-0 nova_compute[187212]: 2025-11-25 20:03:12.948 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:12 compute-0 nova_compute[187212]: 2025-11-25 20:03:12.948 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:12 compute-0 nova_compute[187212]: 2025-11-25 20:03:12.948 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:03:13 compute-0 podman[225341]: 2025-11-25 20:03:13.192211269 +0000 UTC m=+0.112121593 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 25 20:03:15 compute-0 nova_compute[187212]: 2025-11-25 20:03:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:15 compute-0 nova_compute[187212]: 2025-11-25 20:03:15.386 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:16 compute-0 nova_compute[187212]: 2025-11-25 20:03:16.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:16 compute-0 nova_compute[187212]: 2025-11-25 20:03:16.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:03:20 compute-0 nova_compute[187212]: 2025-11-25 20:03:20.388 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:20 compute-0 nova_compute[187212]: 2025-11-25 20:03:20.390 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:20 compute-0 nova_compute[187212]: 2025-11-25 20:03:20.391 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:03:20 compute-0 nova_compute[187212]: 2025-11-25 20:03:20.391 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:20 compute-0 nova_compute[187212]: 2025-11-25 20:03:20.425 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:20 compute-0 nova_compute[187212]: 2025-11-25 20:03:20.426 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:23 compute-0 podman[225362]: 2025-11-25 20:03:23.180357653 +0000 UTC m=+0.080297475 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 20:03:25 compute-0 nova_compute[187212]: 2025-11-25 20:03:25.427 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:25 compute-0 nova_compute[187212]: 2025-11-25 20:03:25.428 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:29 compute-0 podman[197585]: time="2025-11-25T20:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:03:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:03:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.430 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.431 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.432 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.432 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.459 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.460 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:30 compute-0 nova_compute[187212]: 2025-11-25 20:03:30.463 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:03:31.205 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:03:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:03:31.206 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:03:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:03:31.207 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: ERROR   20:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: ERROR   20:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: ERROR   20:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: ERROR   20:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: ERROR   20:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:03:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:03:32 compute-0 podman[225390]: 2025-11-25 20:03:32.230911031 +0000 UTC m=+0.153931124 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 20:03:35 compute-0 nova_compute[187212]: 2025-11-25 20:03:35.461 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:35 compute-0 nova_compute[187212]: 2025-11-25 20:03:35.465 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:36 compute-0 podman[225418]: 2025-11-25 20:03:36.159346625 +0000 UTC m=+0.088046020 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 25 20:03:40 compute-0 nova_compute[187212]: 2025-11-25 20:03:40.463 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:40 compute-0 nova_compute[187212]: 2025-11-25 20:03:40.466 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:41 compute-0 sshd-session[225437]: Invalid user debian from 209.38.103.174 port 35260
Nov 25 20:03:41 compute-0 sshd-session[225437]: Connection closed by invalid user debian 209.38.103.174 port 35260 [preauth]
Nov 25 20:03:41 compute-0 podman[225439]: 2025-11-25 20:03:41.15127471 +0000 UTC m=+0.103664911 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.6, config_id=edpm)
Nov 25 20:03:44 compute-0 podman[225461]: 2025-11-25 20:03:44.177073128 +0000 UTC m=+0.090477422 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 20:03:45 compute-0 nova_compute[187212]: 2025-11-25 20:03:45.467 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:45 compute-0 nova_compute[187212]: 2025-11-25 20:03:45.469 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:03:45 compute-0 nova_compute[187212]: 2025-11-25 20:03:45.470 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:03:45 compute-0 nova_compute[187212]: 2025-11-25 20:03:45.470 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:45 compute-0 nova_compute[187212]: 2025-11-25 20:03:45.499 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:45 compute-0 nova_compute[187212]: 2025-11-25 20:03:45.500 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:03:50 compute-0 nova_compute[187212]: 2025-11-25 20:03:50.634 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:54 compute-0 podman[225482]: 2025-11-25 20:03:54.145275657 +0000 UTC m=+0.064990002 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 20:03:55 compute-0 nova_compute[187212]: 2025-11-25 20:03:55.535 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:55 compute-0 nova_compute[187212]: 2025-11-25 20:03:55.678 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:03:59 compute-0 podman[197585]: time="2025-11-25T20:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:03:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:03:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3096 "" "Go-http-client/1.1"
Nov 25 20:04:00 compute-0 nova_compute[187212]: 2025-11-25 20:04:00.538 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:00 compute-0 nova_compute[187212]: 2025-11-25 20:04:00.724 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: ERROR   20:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: ERROR   20:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: ERROR   20:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: ERROR   20:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: ERROR   20:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:04:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:04:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:04:03.053 104356 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'ae:30:bb', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:7b:66:29:af:01'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Nov 25 20:04:03 compute-0 nova_compute[187212]: 2025-11-25 20:04:03.053 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:03 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:04:03.054 104356 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Nov 25 20:04:03 compute-0 podman[225508]: 2025-11-25 20:04:03.256555623 +0000 UTC m=+0.176362515 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 25 20:04:05 compute-0 nova_compute[187212]: 2025-11-25 20:04:05.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:05 compute-0 nova_compute[187212]: 2025-11-25 20:04:05.540 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:05 compute-0 nova_compute[187212]: 2025-11-25 20:04:05.727 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:07 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:04:07.057 104356 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=942ca545-427a-4223-ba58-570f588d0469, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 25 20:04:07 compute-0 podman[225535]: 2025-11-25 20:04:07.174028268 +0000 UTC m=+0.084475575 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 25 20:04:09 compute-0 nova_compute[187212]: 2025-11-25 20:04:09.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:09 compute-0 nova_compute[187212]: 2025-11-25 20:04:09.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:04:09 compute-0 nova_compute[187212]: 2025-11-25 20:04:09.692 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:04:09 compute-0 nova_compute[187212]: 2025-11-25 20:04:09.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:04:09 compute-0 nova_compute[187212]: 2025-11-25 20:04:09.693 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:04:10 compute-0 nova_compute[187212]: 2025-11-25 20:04:10.542 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:10 compute-0 nova_compute[187212]: 2025-11-25 20:04:10.758 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:10 compute-0 nova_compute[187212]: 2025-11-25 20:04:10.759 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:04:10 compute-0 nova_compute[187212]: 2025-11-25 20:04:10.846 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:04:10 compute-0 nova_compute[187212]: 2025-11-25 20:04:10.847 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:04:10 compute-0 nova_compute[187212]: 2025-11-25 20:04:10.913 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:04:11 compute-0 nova_compute[187212]: 2025-11-25 20:04:11.145 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:04:11 compute-0 nova_compute[187212]: 2025-11-25 20:04:11.148 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:04:11 compute-0 nova_compute[187212]: 2025-11-25 20:04:11.180 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:04:11 compute-0 nova_compute[187212]: 2025-11-25 20:04:11.181 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=72.96322631835938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:04:11 compute-0 nova_compute[187212]: 2025-11-25 20:04:11.182 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:04:11 compute-0 nova_compute[187212]: 2025-11-25 20:04:11.182 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:04:12 compute-0 podman[225561]: 2025-11-25 20:04:12.189616256 +0000 UTC m=+0.111288502 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 25 20:04:12 compute-0 nova_compute[187212]: 2025-11-25 20:04:12.749 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:04:12 compute-0 nova_compute[187212]: 2025-11-25 20:04:12.750 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:04:12 compute-0 nova_compute[187212]: 2025-11-25 20:04:12.750 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:04:11 up  1:56,  0 user,  load average: 0.00, 0.02, 0.07\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:04:12 compute-0 nova_compute[187212]: 2025-11-25 20:04:12.809 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:04:13 compute-0 nova_compute[187212]: 2025-11-25 20:04:13.317 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:04:13 compute-0 nova_compute[187212]: 2025-11-25 20:04:13.836 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:04:13 compute-0 nova_compute[187212]: 2025-11-25 20:04:13.837 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.655s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:04:14 compute-0 nova_compute[187212]: 2025-11-25 20:04:14.838 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:14 compute-0 nova_compute[187212]: 2025-11-25 20:04:14.840 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:14 compute-0 nova_compute[187212]: 2025-11-25 20:04:14.841 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:14 compute-0 nova_compute[187212]: 2025-11-25 20:04:14.841 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:04:15 compute-0 nova_compute[187212]: 2025-11-25 20:04:15.170 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:15 compute-0 podman[225583]: 2025-11-25 20:04:15.177751432 +0000 UTC m=+0.089886468 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 25 20:04:15 compute-0 nova_compute[187212]: 2025-11-25 20:04:15.545 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:15 compute-0 nova_compute[187212]: 2025-11-25 20:04:15.763 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:16 compute-0 nova_compute[187212]: 2025-11-25 20:04:16.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:17 compute-0 nova_compute[187212]: 2025-11-25 20:04:17.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:18 compute-0 nova_compute[187212]: 2025-11-25 20:04:18.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:04:20 compute-0 nova_compute[187212]: 2025-11-25 20:04:20.547 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:20 compute-0 nova_compute[187212]: 2025-11-25 20:04:20.802 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:21 compute-0 sshd-session[225604]: Invalid user dev from 209.38.103.174 port 56514
Nov 25 20:04:21 compute-0 sshd-session[225604]: Connection closed by invalid user dev 209.38.103.174 port 56514 [preauth]
Nov 25 20:04:25 compute-0 podman[225606]: 2025-11-25 20:04:25.156971371 +0000 UTC m=+0.077003269 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:04:25 compute-0 nova_compute[187212]: 2025-11-25 20:04:25.550 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:25 compute-0 nova_compute[187212]: 2025-11-25 20:04:25.833 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:29 compute-0 podman[197585]: time="2025-11-25T20:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:04:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:04:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3098 "" "Go-http-client/1.1"
Nov 25 20:04:30 compute-0 nova_compute[187212]: 2025-11-25 20:04:30.553 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:30 compute-0 nova_compute[187212]: 2025-11-25 20:04:30.836 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:04:31.208 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:04:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:04:31.208 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:04:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:04:31.209 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: ERROR   20:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: ERROR   20:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: ERROR   20:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: ERROR   20:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: ERROR   20:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:04:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:04:34 compute-0 podman[225633]: 2025-11-25 20:04:34.213220427 +0000 UTC m=+0.124020565 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 20:04:35 compute-0 nova_compute[187212]: 2025-11-25 20:04:35.555 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:35 compute-0 nova_compute[187212]: 2025-11-25 20:04:35.839 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:35 compute-0 ovn_controller[95465]: 2025-11-25T20:04:35Z|00181|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 25 20:04:38 compute-0 podman[225659]: 2025-11-25 20:04:38.156322909 +0000 UTC m=+0.077212074 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 25 20:04:40 compute-0 nova_compute[187212]: 2025-11-25 20:04:40.560 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:40 compute-0 nova_compute[187212]: 2025-11-25 20:04:40.841 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:43 compute-0 podman[225678]: 2025-11-25 20:04:43.172898994 +0000 UTC m=+0.089657592 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64)
Nov 25 20:04:45 compute-0 nova_compute[187212]: 2025-11-25 20:04:45.562 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:45 compute-0 nova_compute[187212]: 2025-11-25 20:04:45.843 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:46 compute-0 podman[225699]: 2025-11-25 20:04:46.151391887 +0000 UTC m=+0.078378255 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 25 20:04:50 compute-0 nova_compute[187212]: 2025-11-25 20:04:50.976 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:55 compute-0 nova_compute[187212]: 2025-11-25 20:04:55.568 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:55 compute-0 nova_compute[187212]: 2025-11-25 20:04:55.980 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:04:56 compute-0 podman[225719]: 2025-11-25 20:04:56.194663883 +0000 UTC m=+0.110124631 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:04:59 compute-0 podman[197585]: time="2025-11-25T20:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:04:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:04:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3096 "" "Go-http-client/1.1"
Nov 25 20:05:00 compute-0 nova_compute[187212]: 2025-11-25 20:05:00.570 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:01 compute-0 nova_compute[187212]: 2025-11-25 20:05:01.026 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:01 compute-0 sshd-session[225743]: Invalid user dev from 209.38.103.174 port 49800
Nov 25 20:05:01 compute-0 sshd-session[225743]: Connection closed by invalid user dev 209.38.103.174 port 49800 [preauth]
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: ERROR   20:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: ERROR   20:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: ERROR   20:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: ERROR   20:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: ERROR   20:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:05:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:05:05 compute-0 podman[225745]: 2025-11-25 20:05:05.197531285 +0000 UTC m=+0.121050698 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 25 20:05:05 compute-0 nova_compute[187212]: 2025-11-25 20:05:05.572 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:06 compute-0 nova_compute[187212]: 2025-11-25 20:05:06.027 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:07 compute-0 nova_compute[187212]: 2025-11-25 20:05:07.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:09 compute-0 podman[225772]: 2025-11-25 20:05:09.153914576 +0000 UTC m=+0.077930163 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.173 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.575 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.725 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.726 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.726 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:05:10 compute-0 nova_compute[187212]: 2025-11-25 20:05:10.726 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:05:11 compute-0 nova_compute[187212]: 2025-11-25 20:05:11.030 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:11 compute-0 nova_compute[187212]: 2025-11-25 20:05:11.772 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:05:11 compute-0 nova_compute[187212]: 2025-11-25 20:05:11.858 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:05:11 compute-0 nova_compute[187212]: 2025-11-25 20:05:11.859 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:05:11 compute-0 nova_compute[187212]: 2025-11-25 20:05:11.945 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:05:12 compute-0 nova_compute[187212]: 2025-11-25 20:05:12.193 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:05:12 compute-0 nova_compute[187212]: 2025-11-25 20:05:12.195 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:05:12 compute-0 nova_compute[187212]: 2025-11-25 20:05:12.220 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:05:12 compute-0 nova_compute[187212]: 2025-11-25 20:05:12.221 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5608MB free_disk=72.9632453918457GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:05:12 compute-0 nova_compute[187212]: 2025-11-25 20:05:12.221 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:05:12 compute-0 nova_compute[187212]: 2025-11-25 20:05:12.222 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:05:13 compute-0 nova_compute[187212]: 2025-11-25 20:05:13.786 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:05:13 compute-0 nova_compute[187212]: 2025-11-25 20:05:13.787 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:05:13 compute-0 nova_compute[187212]: 2025-11-25 20:05:13.787 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:05:12 up  1:57,  0 user,  load average: 0.00, 0.02, 0.07\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:05:13 compute-0 nova_compute[187212]: 2025-11-25 20:05:13.904 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:05:14 compute-0 podman[225799]: 2025-11-25 20:05:14.184491979 +0000 UTC m=+0.100236420 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 25 20:05:14 compute-0 nova_compute[187212]: 2025-11-25 20:05:14.412 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:05:14 compute-0 nova_compute[187212]: 2025-11-25 20:05:14.924 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:05:14 compute-0 nova_compute[187212]: 2025-11-25 20:05:14.925 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.703s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:05:14 compute-0 nova_compute[187212]: 2025-11-25 20:05:14.925 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:14 compute-0 nova_compute[187212]: 2025-11-25 20:05:14.925 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Nov 25 20:05:15 compute-0 nova_compute[187212]: 2025-11-25 20:05:15.577 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:16 compute-0 nova_compute[187212]: 2025-11-25 20:05:16.033 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:17 compute-0 podman[225822]: 2025-11-25 20:05:17.180035401 +0000 UTC m=+0.098692879 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 20:05:18 compute-0 nova_compute[187212]: 2025-11-25 20:05:18.432 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:18 compute-0 nova_compute[187212]: 2025-11-25 20:05:18.433 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:18 compute-0 nova_compute[187212]: 2025-11-25 20:05:18.433 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:18 compute-0 nova_compute[187212]: 2025-11-25 20:05:18.434 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:19 compute-0 nova_compute[187212]: 2025-11-25 20:05:19.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:20 compute-0 nova_compute[187212]: 2025-11-25 20:05:20.579 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:21 compute-0 nova_compute[187212]: 2025-11-25 20:05:21.038 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:25 compute-0 nova_compute[187212]: 2025-11-25 20:05:25.581 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:26 compute-0 nova_compute[187212]: 2025-11-25 20:05:26.042 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:27 compute-0 podman[225844]: 2025-11-25 20:05:27.168152722 +0000 UTC m=+0.082355369 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 25 20:05:29 compute-0 podman[197585]: time="2025-11-25T20:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:05:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:05:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3096 "" "Go-http-client/1.1"
Nov 25 20:05:30 compute-0 nova_compute[187212]: 2025-11-25 20:05:30.582 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:31 compute-0 nova_compute[187212]: 2025-11-25 20:05:31.046 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:05:31.211 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:05:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:05:31.211 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:05:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:05:31.212 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: ERROR   20:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: ERROR   20:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: ERROR   20:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: ERROR   20:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: ERROR   20:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:05:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:05:35 compute-0 nova_compute[187212]: 2025-11-25 20:05:35.584 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:36 compute-0 nova_compute[187212]: 2025-11-25 20:05:36.049 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:36 compute-0 podman[225869]: 2025-11-25 20:05:36.233584082 +0000 UTC m=+0.150636037 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Nov 25 20:05:38 compute-0 nova_compute[187212]: 2025-11-25 20:05:38.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:40 compute-0 podman[225895]: 2025-11-25 20:05:40.221808411 +0000 UTC m=+0.140671035 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 20:05:40 compute-0 nova_compute[187212]: 2025-11-25 20:05:40.588 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:40 compute-0 sshd-session[225916]: Invalid user dev from 209.38.103.174 port 46186
Nov 25 20:05:40 compute-0 sshd-session[225916]: Connection closed by invalid user dev 209.38.103.174 port 46186 [preauth]
Nov 25 20:05:41 compute-0 nova_compute[187212]: 2025-11-25 20:05:41.052 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:45 compute-0 podman[225918]: 2025-11-25 20:05:45.176812423 +0000 UTC m=+0.088520531 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 20:05:45 compute-0 nova_compute[187212]: 2025-11-25 20:05:45.589 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:45 compute-0 nova_compute[187212]: 2025-11-25 20:05:45.682 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:05:45 compute-0 nova_compute[187212]: 2025-11-25 20:05:45.682 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Nov 25 20:05:46 compute-0 nova_compute[187212]: 2025-11-25 20:05:46.054 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:46 compute-0 nova_compute[187212]: 2025-11-25 20:05:46.189 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Nov 25 20:05:48 compute-0 podman[225940]: 2025-11-25 20:05:48.142076797 +0000 UTC m=+0.066973604 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 20:05:51 compute-0 nova_compute[187212]: 2025-11-25 20:05:51.005 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:51 compute-0 nova_compute[187212]: 2025-11-25 20:05:51.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:56 compute-0 nova_compute[187212]: 2025-11-25 20:05:56.055 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:56 compute-0 nova_compute[187212]: 2025-11-25 20:05:56.058 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:05:58 compute-0 podman[225960]: 2025-11-25 20:05:58.178354829 +0000 UTC m=+0.088196173 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 20:05:59 compute-0 podman[197585]: time="2025-11-25T20:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:05:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:05:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3097 "" "Go-http-client/1.1"
Nov 25 20:06:01 compute-0 nova_compute[187212]: 2025-11-25 20:06:01.056 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:01 compute-0 nova_compute[187212]: 2025-11-25 20:06:01.060 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: ERROR   20:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: ERROR   20:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: ERROR   20:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: ERROR   20:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: ERROR   20:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:06:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:06:06 compute-0 nova_compute[187212]: 2025-11-25 20:06:06.062 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:06:06 compute-0 nova_compute[187212]: 2025-11-25 20:06:06.064 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:06:06 compute-0 nova_compute[187212]: 2025-11-25 20:06:06.064 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:06:06 compute-0 nova_compute[187212]: 2025-11-25 20:06:06.064 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:06:06 compute-0 nova_compute[187212]: 2025-11-25 20:06:06.098 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:06 compute-0 nova_compute[187212]: 2025-11-25 20:06:06.099 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:06:07 compute-0 podman[225987]: 2025-11-25 20:06:07.292117209 +0000 UTC m=+0.202772609 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4)
Nov 25 20:06:09 compute-0 nova_compute[187212]: 2025-11-25 20:06:09.681 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:11 compute-0 nova_compute[187212]: 2025-11-25 20:06:11.100 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:06:11 compute-0 podman[226013]: 2025-11-25 20:06:11.171510152 +0000 UTC m=+0.086450397 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.174 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:06:12 compute-0 nova_compute[187212]: 2025-11-25 20:06:12.694 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:06:13 compute-0 nova_compute[187212]: 2025-11-25 20:06:13.742 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:06:13 compute-0 nova_compute[187212]: 2025-11-25 20:06:13.817 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:06:13 compute-0 nova_compute[187212]: 2025-11-25 20:06:13.819 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:06:13 compute-0 nova_compute[187212]: 2025-11-25 20:06:13.904 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:06:14 compute-0 nova_compute[187212]: 2025-11-25 20:06:14.136 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:06:14 compute-0 nova_compute[187212]: 2025-11-25 20:06:14.139 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:06:14 compute-0 nova_compute[187212]: 2025-11-25 20:06:14.170 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:06:14 compute-0 nova_compute[187212]: 2025-11-25 20:06:14.172 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5611MB free_disk=72.96324157714844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:06:14 compute-0 nova_compute[187212]: 2025-11-25 20:06:14.172 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:06:14 compute-0 nova_compute[187212]: 2025-11-25 20:06:14.173 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.731 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.732 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.733 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:06:14 up  1:58,  0 user,  load average: 0.08, 0.03, 0.07\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.750 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing inventories for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.764 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating ProviderTree inventory for provider bd855788-e41f-445a-8ef6-eb363fed2f12 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.765 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Updating inventory in ProviderTree for provider bd855788-e41f-445a-8ef6-eb363fed2f12 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.784 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing aggregate associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.815 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Refreshing trait associations for resource provider bd855788-e41f-445a-8ef6-eb363fed2f12, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Nov 25 20:06:15 compute-0 nova_compute[187212]: 2025-11-25 20:06:15.870 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:06:16 compute-0 nova_compute[187212]: 2025-11-25 20:06:16.101 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:16 compute-0 podman[226041]: 2025-11-25 20:06:16.176879801 +0000 UTC m=+0.091219142 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 20:06:16 compute-0 nova_compute[187212]: 2025-11-25 20:06:16.379 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:06:16 compute-0 nova_compute[187212]: 2025-11-25 20:06:16.892 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:06:16 compute-0 nova_compute[187212]: 2025-11-25 20:06:16.893 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.720s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:06:17 compute-0 nova_compute[187212]: 2025-11-25 20:06:17.893 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:17 compute-0 nova_compute[187212]: 2025-11-25 20:06:17.894 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:18 compute-0 nova_compute[187212]: 2025-11-25 20:06:18.406 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:18 compute-0 nova_compute[187212]: 2025-11-25 20:06:18.407 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:19 compute-0 podman[226063]: 2025-11-25 20:06:19.168907281 +0000 UTC m=+0.092599230 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 20:06:19 compute-0 nova_compute[187212]: 2025-11-25 20:06:19.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:19 compute-0 sshd-session[226084]: Invalid user dev from 209.38.103.174 port 46064
Nov 25 20:06:19 compute-0 sshd-session[226084]: Connection closed by invalid user dev 209.38.103.174 port 46064 [preauth]
Nov 25 20:06:20 compute-0 nova_compute[187212]: 2025-11-25 20:06:20.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:06:21 compute-0 nova_compute[187212]: 2025-11-25 20:06:21.105 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:26 compute-0 nova_compute[187212]: 2025-11-25 20:06:26.107 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:06:29 compute-0 podman[226086]: 2025-11-25 20:06:29.170443177 +0000 UTC m=+0.085245645 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 20:06:29 compute-0 podman[197585]: time="2025-11-25T20:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:06:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:06:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3089 "" "Go-http-client/1.1"
Nov 25 20:06:31 compute-0 nova_compute[187212]: 2025-11-25 20:06:31.108 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:06:31 compute-0 nova_compute[187212]: 2025-11-25 20:06:31.110 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:31 compute-0 nova_compute[187212]: 2025-11-25 20:06:31.110 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:06:31 compute-0 nova_compute[187212]: 2025-11-25 20:06:31.110 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:06:31 compute-0 nova_compute[187212]: 2025-11-25 20:06:31.111 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:06:31 compute-0 nova_compute[187212]: 2025-11-25 20:06:31.112 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:06:31.213 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:06:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:06:31.215 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:06:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:06:31.216 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: ERROR   20:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: ERROR   20:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: ERROR   20:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: ERROR   20:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: ERROR   20:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:06:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:06:36 compute-0 nova_compute[187212]: 2025-11-25 20:06:36.112 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:38 compute-0 podman[226110]: 2025-11-25 20:06:38.193141918 +0000 UTC m=+0.124508399 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 20:06:41 compute-0 nova_compute[187212]: 2025-11-25 20:06:41.115 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:42 compute-0 podman[226137]: 2025-11-25 20:06:42.151972093 +0000 UTC m=+0.077616564 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 25 20:06:46 compute-0 nova_compute[187212]: 2025-11-25 20:06:46.119 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:47 compute-0 podman[226158]: 2025-11-25 20:06:47.15537283 +0000 UTC m=+0.080130970 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 25 20:06:50 compute-0 podman[226181]: 2025-11-25 20:06:50.164434069 +0000 UTC m=+0.083111859 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 25 20:06:51 compute-0 nova_compute[187212]: 2025-11-25 20:06:51.121 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:51 compute-0 nova_compute[187212]: 2025-11-25 20:06:51.130 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:56 compute-0 nova_compute[187212]: 2025-11-25 20:06:56.124 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:56 compute-0 nova_compute[187212]: 2025-11-25 20:06:56.169 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:06:57 compute-0 sshd-session[226201]: Invalid user dev from 209.38.103.174 port 50172
Nov 25 20:06:57 compute-0 sshd-session[226201]: Connection closed by invalid user dev 209.38.103.174 port 50172 [preauth]
Nov 25 20:06:59 compute-0 podman[197585]: time="2025-11-25T20:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:06:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:06:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3098 "" "Go-http-client/1.1"
Nov 25 20:07:00 compute-0 podman[226203]: 2025-11-25 20:07:00.174653624 +0000 UTC m=+0.094268423 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 25 20:07:01 compute-0 nova_compute[187212]: 2025-11-25 20:07:01.126 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:01 compute-0 nova_compute[187212]: 2025-11-25 20:07:01.170 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: ERROR   20:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: ERROR   20:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: ERROR   20:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: ERROR   20:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: ERROR   20:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:07:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:07:06 compute-0 nova_compute[187212]: 2025-11-25 20:07:06.130 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:06 compute-0 nova_compute[187212]: 2025-11-25 20:07:06.172 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:09 compute-0 podman[226229]: 2025-11-25 20:07:09.222698597 +0000 UTC m=+0.139522275 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Nov 25 20:07:10 compute-0 nova_compute[187212]: 2025-11-25 20:07:10.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:11 compute-0 nova_compute[187212]: 2025-11-25 20:07:11.131 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:11 compute-0 nova_compute[187212]: 2025-11-25 20:07:11.174 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:13 compute-0 nova_compute[187212]: 2025-11-25 20:07:13.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:13 compute-0 nova_compute[187212]: 2025-11-25 20:07:13.175 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:13 compute-0 podman[226255]: 2025-11-25 20:07:13.196161517 +0000 UTC m=+0.106038733 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 25 20:07:13 compute-0 nova_compute[187212]: 2025-11-25 20:07:13.693 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:07:13 compute-0 nova_compute[187212]: 2025-11-25 20:07:13.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:07:13 compute-0 nova_compute[187212]: 2025-11-25 20:07:13.694 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:07:13 compute-0 nova_compute[187212]: 2025-11-25 20:07:13.694 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:07:14 compute-0 nova_compute[187212]: 2025-11-25 20:07:14.741 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:07:14 compute-0 nova_compute[187212]: 2025-11-25 20:07:14.832 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:07:14 compute-0 nova_compute[187212]: 2025-11-25 20:07:14.834 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:07:14 compute-0 nova_compute[187212]: 2025-11-25 20:07:14.920 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:07:15 compute-0 nova_compute[187212]: 2025-11-25 20:07:15.178 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:07:15 compute-0 nova_compute[187212]: 2025-11-25 20:07:15.180 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:07:15 compute-0 nova_compute[187212]: 2025-11-25 20:07:15.224 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:07:15 compute-0 nova_compute[187212]: 2025-11-25 20:07:15.225 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5581MB free_disk=72.96319580078125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:07:15 compute-0 nova_compute[187212]: 2025-11-25 20:07:15.225 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:07:15 compute-0 nova_compute[187212]: 2025-11-25 20:07:15.226 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.175 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.178 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.178 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.179 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.180 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.181 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.818 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.818 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.819 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:07:15 up  1:59,  0 user,  load average: 0.23, 0.07, 0.08\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:07:16 compute-0 nova_compute[187212]: 2025-11-25 20:07:16.860 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:07:17 compute-0 nova_compute[187212]: 2025-11-25 20:07:17.367 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:07:17 compute-0 nova_compute[187212]: 2025-11-25 20:07:17.880 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:07:17 compute-0 nova_compute[187212]: 2025-11-25 20:07:17.880 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.655s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:07:18 compute-0 podman[226281]: 2025-11-25 20:07:18.17584753 +0000 UTC m=+0.098856184 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, release=1755695350)
Nov 25 20:07:18 compute-0 nova_compute[187212]: 2025-11-25 20:07:18.879 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:18 compute-0 nova_compute[187212]: 2025-11-25 20:07:18.879 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:18 compute-0 nova_compute[187212]: 2025-11-25 20:07:18.880 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:07:19 compute-0 nova_compute[187212]: 2025-11-25 20:07:19.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:21 compute-0 nova_compute[187212]: 2025-11-25 20:07:21.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:21 compute-0 nova_compute[187212]: 2025-11-25 20:07:21.182 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:21 compute-0 nova_compute[187212]: 2025-11-25 20:07:21.184 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:21 compute-0 podman[226303]: 2025-11-25 20:07:21.192030754 +0000 UTC m=+0.104578244 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 20:07:22 compute-0 nova_compute[187212]: 2025-11-25 20:07:22.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:07:26 compute-0 nova_compute[187212]: 2025-11-25 20:07:26.185 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:26 compute-0 nova_compute[187212]: 2025-11-25 20:07:26.190 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:29 compute-0 podman[197585]: time="2025-11-25T20:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:07:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:07:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3092 "" "Go-http-client/1.1"
Nov 25 20:07:31 compute-0 podman[226323]: 2025-11-25 20:07:31.158161208 +0000 UTC m=+0.072124050 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 25 20:07:31 compute-0 nova_compute[187212]: 2025-11-25 20:07:31.189 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:07:31.218 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:07:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:07:31.218 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:07:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:07:31.219 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: ERROR   20:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: ERROR   20:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: ERROR   20:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: ERROR   20:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: ERROR   20:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:07:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:07:35 compute-0 sshd-session[226348]: Invalid user dev from 209.38.103.174 port 32898
Nov 25 20:07:35 compute-0 sshd-session[226348]: Connection closed by invalid user dev 209.38.103.174 port 32898 [preauth]
Nov 25 20:07:36 compute-0 nova_compute[187212]: 2025-11-25 20:07:36.194 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:36 compute-0 nova_compute[187212]: 2025-11-25 20:07:36.195 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:36 compute-0 nova_compute[187212]: 2025-11-25 20:07:36.195 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:07:36 compute-0 nova_compute[187212]: 2025-11-25 20:07:36.195 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:36 compute-0 nova_compute[187212]: 2025-11-25 20:07:36.196 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:36 compute-0 nova_compute[187212]: 2025-11-25 20:07:36.197 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:40 compute-0 podman[226350]: 2025-11-25 20:07:40.236378215 +0000 UTC m=+0.150393931 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Nov 25 20:07:41 compute-0 nova_compute[187212]: 2025-11-25 20:07:41.198 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:41 compute-0 nova_compute[187212]: 2025-11-25 20:07:41.199 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:41 compute-0 nova_compute[187212]: 2025-11-25 20:07:41.199 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:07:41 compute-0 nova_compute[187212]: 2025-11-25 20:07:41.199 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:41 compute-0 nova_compute[187212]: 2025-11-25 20:07:41.200 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:41 compute-0 nova_compute[187212]: 2025-11-25 20:07:41.201 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:44 compute-0 podman[226377]: 2025-11-25 20:07:44.185287058 +0000 UTC m=+0.090791402 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.203 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.205 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.205 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.205 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.238 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.239 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:46 compute-0 nova_compute[187212]: 2025-11-25 20:07:46.240 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:49 compute-0 podman[226397]: 2025-11-25 20:07:49.161299114 +0000 UTC m=+0.075318324 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 20:07:51 compute-0 nova_compute[187212]: 2025-11-25 20:07:51.240 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:52 compute-0 podman[226418]: 2025-11-25 20:07:52.167967969 +0000 UTC m=+0.079623117 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 20:07:56 compute-0 nova_compute[187212]: 2025-11-25 20:07:56.244 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:56 compute-0 nova_compute[187212]: 2025-11-25 20:07:56.249 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:07:56 compute-0 nova_compute[187212]: 2025-11-25 20:07:56.249 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:07:56 compute-0 nova_compute[187212]: 2025-11-25 20:07:56.249 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:56 compute-0 nova_compute[187212]: 2025-11-25 20:07:56.277 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:07:56 compute-0 nova_compute[187212]: 2025-11-25 20:07:56.277 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:07:59 compute-0 podman[197585]: time="2025-11-25T20:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:07:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:07:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3093 "" "Go-http-client/1.1"
Nov 25 20:08:01 compute-0 nova_compute[187212]: 2025-11-25 20:08:01.278 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:01 compute-0 nova_compute[187212]: 2025-11-25 20:08:01.279 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: ERROR   20:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: ERROR   20:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: ERROR   20:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: ERROR   20:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: ERROR   20:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:08:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:08:02 compute-0 podman[226438]: 2025-11-25 20:08:02.174075477 +0000 UTC m=+0.078962660 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 20:08:06 compute-0 nova_compute[187212]: 2025-11-25 20:08:06.280 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:06 compute-0 nova_compute[187212]: 2025-11-25 20:08:06.283 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:06 compute-0 nova_compute[187212]: 2025-11-25 20:08:06.283 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:08:06 compute-0 nova_compute[187212]: 2025-11-25 20:08:06.283 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:06 compute-0 nova_compute[187212]: 2025-11-25 20:08:06.292 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:06 compute-0 nova_compute[187212]: 2025-11-25 20:08:06.293 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:11 compute-0 podman[226462]: 2025-11-25 20:08:11.282658273 +0000 UTC m=+0.197381418 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 20:08:11 compute-0 nova_compute[187212]: 2025-11-25 20:08:11.293 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:11 compute-0 nova_compute[187212]: 2025-11-25 20:08:11.295 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:12 compute-0 nova_compute[187212]: 2025-11-25 20:08:12.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:12 compute-0 sshd-session[226489]: Invalid user dev from 209.38.103.174 port 52634
Nov 25 20:08:12 compute-0 sshd-session[226489]: Connection closed by invalid user dev 209.38.103.174 port 52634 [preauth]
Nov 25 20:08:13 compute-0 nova_compute[187212]: 2025-11-25 20:08:13.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:13 compute-0 nova_compute[187212]: 2025-11-25 20:08:13.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:13 compute-0 nova_compute[187212]: 2025-11-25 20:08:13.690 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:08:13 compute-0 nova_compute[187212]: 2025-11-25 20:08:13.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:08:13 compute-0 nova_compute[187212]: 2025-11-25 20:08:13.691 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:08:13 compute-0 nova_compute[187212]: 2025-11-25 20:08:13.692 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:08:14 compute-0 nova_compute[187212]: 2025-11-25 20:08:14.741 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:08:14 compute-0 nova_compute[187212]: 2025-11-25 20:08:14.832 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:08:14 compute-0 nova_compute[187212]: 2025-11-25 20:08:14.833 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:08:14 compute-0 nova_compute[187212]: 2025-11-25 20:08:14.904 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:08:15 compute-0 nova_compute[187212]: 2025-11-25 20:08:15.147 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:08:15 compute-0 nova_compute[187212]: 2025-11-25 20:08:15.149 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:08:15 compute-0 nova_compute[187212]: 2025-11-25 20:08:15.181 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:08:15 compute-0 nova_compute[187212]: 2025-11-25 20:08:15.183 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5587MB free_disk=72.9626693725586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:08:15 compute-0 nova_compute[187212]: 2025-11-25 20:08:15.183 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:08:15 compute-0 nova_compute[187212]: 2025-11-25 20:08:15.184 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:08:15 compute-0 podman[226497]: 2025-11-25 20:08:15.202857461 +0000 UTC m=+0.114691571 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.332 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.334 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.335 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.335 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.335 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.337 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.774 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.775 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.776 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:08:15 up  2:00,  0 user,  load average: 0.15, 0.08, 0.08\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:08:16 compute-0 nova_compute[187212]: 2025-11-25 20:08:16.911 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:08:17 compute-0 nova_compute[187212]: 2025-11-25 20:08:17.419 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:08:17 compute-0 nova_compute[187212]: 2025-11-25 20:08:17.934 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:08:17 compute-0 nova_compute[187212]: 2025-11-25 20:08:17.934 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.750s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:08:18 compute-0 sshd-session[226517]: Received disconnect from 111.61.229.78 port 40726:11:  [preauth]
Nov 25 20:08:18 compute-0 sshd-session[226517]: Disconnected from authenticating user root 111.61.229.78 port 40726 [preauth]
Nov 25 20:08:19 compute-0 nova_compute[187212]: 2025-11-25 20:08:19.934 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:19 compute-0 nova_compute[187212]: 2025-11-25 20:08:19.935 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:20 compute-0 podman[226519]: 2025-11-25 20:08:20.164541549 +0000 UTC m=+0.073554537 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Nov 25 20:08:20 compute-0 nova_compute[187212]: 2025-11-25 20:08:20.448 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:20 compute-0 nova_compute[187212]: 2025-11-25 20:08:20.449 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:20 compute-0 nova_compute[187212]: 2025-11-25 20:08:20.449 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:08:21 compute-0 nova_compute[187212]: 2025-11-25 20:08:21.338 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:22 compute-0 nova_compute[187212]: 2025-11-25 20:08:22.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:23 compute-0 podman[226541]: 2025-11-25 20:08:23.200401273 +0000 UTC m=+0.111691501 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Nov 25 20:08:24 compute-0 nova_compute[187212]: 2025-11-25 20:08:24.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:08:26 compute-0 nova_compute[187212]: 2025-11-25 20:08:26.341 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:26 compute-0 nova_compute[187212]: 2025-11-25 20:08:26.341 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:26 compute-0 nova_compute[187212]: 2025-11-25 20:08:26.342 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:08:26 compute-0 nova_compute[187212]: 2025-11-25 20:08:26.342 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:26 compute-0 nova_compute[187212]: 2025-11-25 20:08:26.342 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:26 compute-0 nova_compute[187212]: 2025-11-25 20:08:26.343 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:29 compute-0 podman[197585]: time="2025-11-25T20:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:08:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:08:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3091 "" "Go-http-client/1.1"
Nov 25 20:08:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:08:31.221 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:08:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:08:31.222 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:08:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:08:31.222 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:08:31 compute-0 nova_compute[187212]: 2025-11-25 20:08:31.344 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: ERROR   20:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: ERROR   20:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: ERROR   20:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: ERROR   20:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: ERROR   20:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:08:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:08:33 compute-0 podman[226562]: 2025-11-25 20:08:33.16479094 +0000 UTC m=+0.086203521 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 20:08:36 compute-0 nova_compute[187212]: 2025-11-25 20:08:36.347 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:41 compute-0 nova_compute[187212]: 2025-11-25 20:08:41.349 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:41 compute-0 nova_compute[187212]: 2025-11-25 20:08:41.352 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:41 compute-0 nova_compute[187212]: 2025-11-25 20:08:41.353 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:08:41 compute-0 nova_compute[187212]: 2025-11-25 20:08:41.353 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:41 compute-0 nova_compute[187212]: 2025-11-25 20:08:41.397 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:41 compute-0 nova_compute[187212]: 2025-11-25 20:08:41.399 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:08:42 compute-0 podman[226587]: 2025-11-25 20:08:42.165438793 +0000 UTC m=+0.086189451 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 25 20:08:46 compute-0 podman[226614]: 2025-11-25 20:08:46.16576976 +0000 UTC m=+0.078531849 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Nov 25 20:08:46 compute-0 nova_compute[187212]: 2025-11-25 20:08:46.400 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:08:50 compute-0 sshd-session[226634]: Invalid user dev from 209.38.103.174 port 46166
Nov 25 20:08:50 compute-0 sshd-session[226634]: Connection closed by invalid user dev 209.38.103.174 port 46166 [preauth]
Nov 25 20:08:50 compute-0 podman[226636]: 2025-11-25 20:08:50.875740393 +0000 UTC m=+0.128057683 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 25 20:08:51 compute-0 nova_compute[187212]: 2025-11-25 20:08:51.402 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:54 compute-0 podman[226658]: 2025-11-25 20:08:54.144164629 +0000 UTC m=+0.063226006 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 25 20:08:56 compute-0 nova_compute[187212]: 2025-11-25 20:08:56.404 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:08:59 compute-0 podman[197585]: time="2025-11-25T20:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:08:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:08:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3093 "" "Go-http-client/1.1"
Nov 25 20:09:01 compute-0 nova_compute[187212]: 2025-11-25 20:09:01.406 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:09:01 compute-0 nova_compute[187212]: 2025-11-25 20:09:01.407 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:01 compute-0 nova_compute[187212]: 2025-11-25 20:09:01.408 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:09:01 compute-0 nova_compute[187212]: 2025-11-25 20:09:01.408 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:01 compute-0 nova_compute[187212]: 2025-11-25 20:09:01.409 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:01 compute-0 nova_compute[187212]: 2025-11-25 20:09:01.410 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: ERROR   20:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: ERROR   20:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: ERROR   20:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: ERROR   20:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: ERROR   20:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:09:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:09:04 compute-0 podman[226678]: 2025-11-25 20:09:04.155626794 +0000 UTC m=+0.069434349 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 20:09:06 compute-0 nova_compute[187212]: 2025-11-25 20:09:06.411 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:11 compute-0 nova_compute[187212]: 2025-11-25 20:09:11.413 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:09:13 compute-0 nova_compute[187212]: 2025-11-25 20:09:13.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:13 compute-0 podman[226702]: 2025-11-25 20:09:13.395576308 +0000 UTC m=+0.312602101 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Nov 25 20:09:14 compute-0 nova_compute[187212]: 2025-11-25 20:09:14.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:14 compute-0 nova_compute[187212]: 2025-11-25 20:09:14.687 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:09:14 compute-0 nova_compute[187212]: 2025-11-25 20:09:14.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:09:14 compute-0 nova_compute[187212]: 2025-11-25 20:09:14.688 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:09:14 compute-0 nova_compute[187212]: 2025-11-25 20:09:14.688 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Nov 25 20:09:15 compute-0 nova_compute[187212]: 2025-11-25 20:09:15.730 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:09:15 compute-0 nova_compute[187212]: 2025-11-25 20:09:15.812 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:09:15 compute-0 nova_compute[187212]: 2025-11-25 20:09:15.814 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:09:15 compute-0 nova_compute[187212]: 2025-11-25 20:09:15.874 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f71d9429-2da3-4b6b-b82d-63027e46f952/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.086 187216 WARNING nova.virt.libvirt.driver [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.087 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.127 187216 DEBUG oslo_concurrency.processutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.128 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5587MB free_disk=72.96270751953125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.128 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.128 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.415 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.417 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.417 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.417 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.418 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:16 compute-0 nova_compute[187212]: 2025-11-25 20:09:16.419 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:17 compute-0 podman[226735]: 2025-11-25 20:09:17.181319535 +0000 UTC m=+0.103932817 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 25 20:09:17 compute-0 nova_compute[187212]: 2025-11-25 20:09:17.733 187216 INFO nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Instance 729abee0-8f2c-4b6a-a5ae-17a0bd3e7362 has allocations against this compute host but is not found in the database.
Nov 25 20:09:17 compute-0 nova_compute[187212]: 2025-11-25 20:09:17.734 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Nov 25 20:09:17 compute-0 nova_compute[187212]: 2025-11-25 20:09:17.734 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:09:16 up  2:01,  0 user,  load average: 0.05, 0.07, 0.08\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_e0287f0353d44a63af6cafda5ee0aa0c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Nov 25 20:09:17 compute-0 nova_compute[187212]: 2025-11-25 20:09:17.779 187216 DEBUG nova.compute.provider_tree [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed in ProviderTree for provider: bd855788-e41f-445a-8ef6-eb363fed2f12 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Nov 25 20:09:18 compute-0 nova_compute[187212]: 2025-11-25 20:09:18.291 187216 DEBUG nova.scheduler.client.report [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Inventory has not changed for provider bd855788-e41f-445a-8ef6-eb363fed2f12 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Nov 25 20:09:18 compute-0 nova_compute[187212]: 2025-11-25 20:09:18.809 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Nov 25 20:09:18 compute-0 nova_compute[187212]: 2025-11-25 20:09:18.809 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.681s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:09:19 compute-0 nova_compute[187212]: 2025-11-25 20:09:19.811 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:19 compute-0 nova_compute[187212]: 2025-11-25 20:09:19.812 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:19 compute-0 nova_compute[187212]: 2025-11-25 20:09:19.812 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:19 compute-0 nova_compute[187212]: 2025-11-25 20:09:19.813 187216 DEBUG nova.compute.manager [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Nov 25 20:09:20 compute-0 nova_compute[187212]: 2025-11-25 20:09:20.174 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:21 compute-0 podman[226755]: 2025-11-25 20:09:21.184479147 +0000 UTC m=+0.095785483 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Nov 25 20:09:21 compute-0 nova_compute[187212]: 2025-11-25 20:09:21.418 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:23 compute-0 nova_compute[187212]: 2025-11-25 20:09:23.172 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:25 compute-0 podman[226777]: 2025-11-25 20:09:25.135056893 +0000 UTC m=+0.062400354 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Nov 25 20:09:25 compute-0 nova_compute[187212]: 2025-11-25 20:09:25.176 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:09:26 compute-0 nova_compute[187212]: 2025-11-25 20:09:26.421 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:29 compute-0 sshd-session[226798]: Invalid user dev from 209.38.103.174 port 53342
Nov 25 20:09:29 compute-0 podman[197585]: time="2025-11-25T20:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:09:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:09:29 compute-0 podman[197585]: @ - - [25/Nov/2025:20:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3090 "" "Go-http-client/1.1"
Nov 25 20:09:29 compute-0 sshd-session[226798]: Connection closed by invalid user dev 209.38.103.174 port 53342 [preauth]
Nov 25 20:09:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:09:31.224 104356 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:09:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:09:31.224 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:09:31 compute-0 ovn_metadata_agent[104351]: 2025-11-25 20:09:31.225 104356 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: ERROR   20:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: ERROR   20:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: ERROR   20:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: ERROR   20:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: ERROR   20:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:09:31 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:09:31 compute-0 nova_compute[187212]: 2025-11-25 20:09:31.423 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:09:31 compute-0 nova_compute[187212]: 2025-11-25 20:09:31.424 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:31 compute-0 nova_compute[187212]: 2025-11-25 20:09:31.425 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:09:31 compute-0 nova_compute[187212]: 2025-11-25 20:09:31.425 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:31 compute-0 nova_compute[187212]: 2025-11-25 20:09:31.425 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:31 compute-0 nova_compute[187212]: 2025-11-25 20:09:31.426 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:35 compute-0 podman[226801]: 2025-11-25 20:09:35.141959341 +0000 UTC m=+0.073795005 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 25 20:09:36 compute-0 nova_compute[187212]: 2025-11-25 20:09:36.427 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:09:36 compute-0 nova_compute[187212]: 2025-11-25 20:09:36.428 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:36 compute-0 nova_compute[187212]: 2025-11-25 20:09:36.429 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Nov 25 20:09:36 compute-0 nova_compute[187212]: 2025-11-25 20:09:36.429 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:36 compute-0 nova_compute[187212]: 2025-11-25 20:09:36.430 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Nov 25 20:09:36 compute-0 nova_compute[187212]: 2025-11-25 20:09:36.431 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:41 compute-0 nova_compute[187212]: 2025-11-25 20:09:41.430 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:41 compute-0 nova_compute[187212]: 2025-11-25 20:09:41.432 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:44 compute-0 podman[226826]: 2025-11-25 20:09:44.174228507 +0000 UTC m=+0.094685874 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible)
Nov 25 20:09:46 compute-0 nova_compute[187212]: 2025-11-25 20:09:46.433 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:48 compute-0 podman[226854]: 2025-11-25 20:09:48.156803877 +0000 UTC m=+0.078012995 container health_status 954c5afbfe5ae704b8c7ede179050be3dbcfe93950f35c4a7bdd521d1c491cec (image=38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 25 20:09:51 compute-0 nova_compute[187212]: 2025-11-25 20:09:51.433 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:51 compute-0 nova_compute[187212]: 2025-11-25 20:09:51.435 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:09:52 compute-0 podman[226874]: 2025-11-25 20:09:52.146812473 +0000 UTC m=+0.076164397 container health_status a9d88c9a20e8d876734d7b98a807e4ab100c7b5c45ac89911e9ca0762270062e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of 
the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 25 20:09:56 compute-0 podman[226895]: 2025-11-25 20:09:56.174610493 +0000 UTC m=+0.094512200 container health_status 1d1af1fd0e5b96605bb0edffe8df4ff291f0d8f48c3b2daeb4bd976a070ea562 (image=38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Nov 25 20:09:56 compute-0 nova_compute[187212]: 2025-11-25 20:09:56.436 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:09:59 compute-0 podman[197585]: time="2025-11-25T20:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 25 20:09:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Nov 25 20:09:59 compute-0 podman[197585]: @ - - [25/Nov/2025:20:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3095 "" "Go-http-client/1.1"
Nov 25 20:10:00 compute-0 sshd-session[226917]: Accepted publickey for zuul from 192.168.122.10 port 41422 ssh2: ECDSA SHA256:Wy+pFN9FEe7/OSx9IarhwObu373pHJY9dBOGDr5K9Zg
Nov 25 20:10:00 compute-0 systemd-logind[820]: New session 40 of user zuul.
Nov 25 20:10:00 compute-0 systemd[1]: Started Session 40 of User zuul.
Nov 25 20:10:00 compute-0 sshd-session[226917]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 25 20:10:00 compute-0 sudo[226921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 25 20:10:00 compute-0 sudo[226921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: ERROR   20:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: ERROR   20:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: ERROR   20:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: ERROR   20:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: ERROR   20:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 25 20:10:01 compute-0 openstack_network_exporter[199731]: 
Nov 25 20:10:01 compute-0 nova_compute[187212]: 2025-11-25 20:10:01.438 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:10:05 compute-0 ovs-vsctl[227103]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 25 20:10:06 compute-0 podman[227110]: 2025-11-25 20:10:06.195551911 +0000 UTC m=+0.078647472 container health_status e14cd1067dc10ab66f0ef22d4c266138cc6e7fab26b7dafa6ba8341c80fb5b5f (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 25 20:10:06 compute-0 nova_compute[187212]: 2025-11-25 20:10:06.441 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Nov 25 20:10:06 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 226945 (sos)
Nov 25 20:10:06 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 25 20:10:06 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 25 20:10:07 compute-0 virtqemud[186888]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 25 20:10:07 compute-0 virtqemud[186888]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 25 20:10:07 compute-0 virtqemud[186888]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 25 20:10:08 compute-0 sshd-session[227326]: Invalid user dev from 209.38.103.174 port 33736
Nov 25 20:10:08 compute-0 sshd-session[227326]: Connection closed by invalid user dev 209.38.103.174 port 33736 [preauth]
Nov 25 20:10:08 compute-0 crontab[227553]: (root) LIST (root)
Nov 25 20:10:11 compute-0 systemd[1]: Starting Hostname Service...
Nov 25 20:10:11 compute-0 systemd[1]: Started Hostname Service.
Nov 25 20:10:11 compute-0 nova_compute[187212]: 2025-11-25 20:10:11.444 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:10:11 compute-0 nova_compute[187212]: 2025-11-25 20:10:11.448 187216 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Nov 25 20:10:14 compute-0 nova_compute[187212]: 2025-11-25 20:10:14.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:10:15 compute-0 nova_compute[187212]: 2025-11-25 20:10:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:10:15 compute-0 nova_compute[187212]: 2025-11-25 20:10:15.173 187216 DEBUG oslo_service.periodic_task [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Nov 25 20:10:15 compute-0 podman[228051]: 2025-11-25 20:10:15.212877963 +0000 UTC m=+0.126239274 container health_status 8c19205969df21588b78ad1e3caf60251a827e5f206107c863d56003bc08fc84 (image=38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.27:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 25 20:10:15 compute-0 nova_compute[187212]: 2025-11-25 20:10:15.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Nov 25 20:10:15 compute-0 nova_compute[187212]: 2025-11-25 20:10:15.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Nov 25 20:10:15 compute-0 nova_compute[187212]: 2025-11-25 20:10:15.686 187216 DEBUG oslo_concurrency.lockutils [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Nov 25 20:10:15 compute-0 nova_compute[187212]: 2025-11-25 20:10:15.687 187216 DEBUG nova.compute.resource_tracker [None req-fabfdbfa-b315-4a44-836d-bfd379f541cc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
